[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 30564 1726882800.94870: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Xyq executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 30564 1726882800.96042: Added group all to inventory 30564 1726882800.96044: Added group ungrouped to inventory 30564 1726882800.96048: Group all now contains ungrouped 30564 1726882800.96051: Examining possible inventory source: /tmp/network-91m/inventory.yml 30564 1726882801.26487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 30564 1726882801.26546: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 30564 1726882801.26573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 30564 1726882801.26633: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 30564 1726882801.26710: Loaded config def from plugin (inventory/script) 30564 1726882801.26712: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 30564 1726882801.26752: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 30564 1726882801.26841: Loaded config def from plugin 
(inventory/yaml) 30564 1726882801.26843: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 30564 1726882801.26931: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 30564 1726882801.27359: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 30564 1726882801.27362: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 30564 1726882801.27370: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 30564 1726882801.27376: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 30564 1726882801.27381: Loading data from /tmp/network-91m/inventory.yml 30564 1726882801.27452: /tmp/network-91m/inventory.yml was not parsable by auto 30564 1726882801.27523: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 30564 1726882801.27578: Loading data from /tmp/network-91m/inventory.yml 30564 1726882801.27655: group all already in inventory 30564 1726882801.27662: set inventory_file for managed_node1 30564 1726882801.27671: set inventory_dir for managed_node1 30564 1726882801.27672: Added host managed_node1 to inventory 30564 1726882801.27675: Added host managed_node1 to group all 30564 1726882801.27677: set ansible_host for managed_node1 30564 1726882801.27678: set ansible_ssh_extra_args for managed_node1 30564 1726882801.27682: set inventory_file for managed_node2 30564 1726882801.27685: set inventory_dir for managed_node2 30564 1726882801.27686: Added host managed_node2 to inventory 30564 1726882801.27687: Added host managed_node2 to group all 30564 1726882801.27688: set ansible_host for managed_node2 30564 1726882801.27689: set ansible_ssh_extra_args for managed_node2 30564 
1726882801.27691: set inventory_file for managed_node3 30564 1726882801.27694: set inventory_dir for managed_node3 30564 1726882801.27695: Added host managed_node3 to inventory 30564 1726882801.27696: Added host managed_node3 to group all 30564 1726882801.27697: set ansible_host for managed_node3 30564 1726882801.27697: set ansible_ssh_extra_args for managed_node3 30564 1726882801.27700: Reconcile groups and hosts in inventory. 30564 1726882801.27703: Group ungrouped now contains managed_node1 30564 1726882801.27705: Group ungrouped now contains managed_node2 30564 1726882801.27707: Group ungrouped now contains managed_node3 30564 1726882801.27790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 30564 1726882801.27913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 30564 1726882801.27961: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 30564 1726882801.27994: Loaded config def from plugin (vars/host_group_vars) 30564 1726882801.27996: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 30564 1726882801.28003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 30564 1726882801.28011: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 30564 1726882801.28051: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 30564 1726882801.28390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882801.28484: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 30564 1726882801.28522: Loaded config def from plugin (connection/local) 30564 1726882801.28525: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 30564 1726882801.29125: Loaded config def from plugin (connection/paramiko_ssh) 30564 1726882801.29128: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 30564 1726882801.30076: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 30564 1726882801.30117: Loaded config def from plugin (connection/psrp) 30564 1726882801.30120: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 30564 1726882801.30849: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 30564 1726882801.30893: Loaded config def from plugin (connection/ssh) 30564 1726882801.30896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 30564 1726882801.32743: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 30564 1726882801.32786: Loaded config def from plugin (connection/winrm) 30564 1726882801.32789: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 30564 1726882801.32820: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 30564 1726882801.32886: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 30564 1726882801.32954: Loaded config def from plugin (shell/cmd) 30564 1726882801.32956: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 30564 1726882801.32986: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 30564 1726882801.33053: Loaded config def from plugin (shell/powershell) 30564 1726882801.33055: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 30564 1726882801.33113: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 30564 1726882801.33301: Loaded config def from plugin (shell/sh) 30564 1726882801.33303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 30564 1726882801.33336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 30564 1726882801.33601: Loaded config def from plugin (become/runas) 30564 1726882801.33603: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 30564 1726882801.33788: Loaded config def from plugin (become/su) 30564 1726882801.33790: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 30564 1726882801.33950: Loaded config def from plugin (become/sudo) 30564 
1726882801.33952: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 30564 1726882801.33989: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml 30564 1726882801.34324: in VariableManager get_vars() 30564 1726882801.34345: done with get_vars() 30564 1726882801.34476: trying /usr/local/lib/python3.12/site-packages/ansible/modules 30564 1726882801.37404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 30564 1726882801.37523: in VariableManager get_vars() 30564 1726882801.37528: done with get_vars() 30564 1726882801.37530: variable 'playbook_dir' from source: magic vars 30564 1726882801.37531: variable 'ansible_playbook_python' from source: magic vars 30564 1726882801.37532: variable 'ansible_config_file' from source: magic vars 30564 1726882801.37533: variable 'groups' from source: magic vars 30564 1726882801.37534: variable 'omit' from source: magic vars 30564 1726882801.37534: variable 'ansible_version' from source: magic vars 30564 1726882801.37535: variable 'ansible_check_mode' from source: magic vars 30564 1726882801.37536: variable 'ansible_diff_mode' from source: magic vars 30564 1726882801.37537: variable 'ansible_forks' from source: magic vars 30564 1726882801.37538: variable 'ansible_inventory_sources' from source: magic vars 30564 1726882801.37538: variable 'ansible_skip_tags' from source: magic vars 30564 1726882801.37539: variable 'ansible_limit' from source: magic vars 30564 1726882801.37540: variable 'ansible_run_tags' from source: magic vars 30564 1726882801.37541: variable 'ansible_verbosity' from source: magic vars 30564 1726882801.37579: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml 30564 1726882801.38242: in VariableManager 
get_vars() 30564 1726882801.38259: done with get_vars() 30564 1726882801.38315: in VariableManager get_vars() 30564 1726882801.38328: done with get_vars() 30564 1726882801.38381: in VariableManager get_vars() 30564 1726882801.38394: done with get_vars() 30564 1726882801.38440: in VariableManager get_vars() 30564 1726882801.38454: done with get_vars() 30564 1726882801.38507: in VariableManager get_vars() 30564 1726882801.38521: done with get_vars() 30564 1726882801.38576: in VariableManager get_vars() 30564 1726882801.38593: done with get_vars() 30564 1726882801.38644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 30564 1726882801.38657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 30564 1726882801.38911: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 30564 1726882801.39086: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 30564 1726882801.39089: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 30564 1726882801.39119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 30564 1726882801.39143: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 30564 1726882801.39317: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 30564 1726882801.39383: Loaded config def from plugin (callback/default) 30564 1726882801.39385: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 30564 1726882801.41594: Loaded config def from plugin (callback/junit) 30564 1726882801.41597: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 30564 1726882801.41645: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 30564 1726882801.41718: Loaded config def from plugin (callback/minimal) 30564 1726882801.41720: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 30564 1726882801.41759: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 30564 1726882801.41821: Loaded config def from plugin (callback/tree) 30564 1726882801.41824: Loading CallbackModule 
'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 30564 1726882801.41941: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 30564 1726882801.41944: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_states_nm.yml ************************************************** 2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml 30564 1726882801.41981: in VariableManager get_vars() 30564 1726882801.41996: done with get_vars() 30564 1726882801.42002: in VariableManager get_vars() 30564 1726882801.42010: done with get_vars() 30564 1726882801.42014: variable 'omit' from source: magic vars 30564 1726882801.42056: in VariableManager get_vars() 30564 1726882801.42076: done with get_vars() 30564 1726882801.42099: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_states.yml' with nm as provider] *********** 30564 1726882801.42647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 30564 1726882801.42837: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 30564 1726882801.42873: getting the remaining hosts for this loop 30564 1726882801.42875: done getting the remaining hosts for this loop 30564 1726882801.42878: getting the next task for host managed_node2 30564 1726882801.42882: done getting next task for host managed_node2 30564 1726882801.42884: ^ task is: TASK: Gathering Facts 30564 1726882801.42886: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882801.42888: getting variables 30564 1726882801.42889: in VariableManager get_vars() 30564 1726882801.42900: Calling all_inventory to load vars for managed_node2 30564 1726882801.42902: Calling groups_inventory to load vars for managed_node2 30564 1726882801.42905: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882801.42917: Calling all_plugins_play to load vars for managed_node2 30564 1726882801.42928: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882801.42932: Calling groups_plugins_play to load vars for managed_node2 30564 1726882801.42974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882801.43029: done with get_vars() 30564 1726882801.43036: done getting variables 30564 1726882801.43105: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6 Friday 20 September 2024 21:40:01 -0400 (0:00:00.012) 0:00:00.012 ****** 30564 1726882801.43126: entering _queue_task() for managed_node2/gather_facts 30564 1726882801.43127: Creating lock for gather_facts 30564 1726882801.44144: worker is 1 (out of 1 available) 30564 1726882801.44155: exiting _queue_task() for managed_node2/gather_facts 30564 1726882801.44170: done queuing things up, now waiting for results queue to drain 30564 1726882801.44171: waiting for pending results... 
30564 1726882801.44403: running TaskExecutor() for managed_node2/TASK: Gathering Facts 30564 1726882801.44503: in run() - task 0e448fcc-3ce9-4216-acec-00000000001b 30564 1726882801.44526: variable 'ansible_search_path' from source: unknown 30564 1726882801.44566: calling self._execute() 30564 1726882801.44633: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882801.44645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882801.44657: variable 'omit' from source: magic vars 30564 1726882801.44758: variable 'omit' from source: magic vars 30564 1726882801.44793: variable 'omit' from source: magic vars 30564 1726882801.44831: variable 'omit' from source: magic vars 30564 1726882801.44890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882801.44930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882801.44958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882801.44984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882801.45000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882801.45033: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882801.45041: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882801.45048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882801.45154: Set connection var ansible_timeout to 10 30564 1726882801.45169: Set connection var ansible_pipelining to False 30564 1726882801.45178: Set connection var ansible_shell_type to sh 30564 1726882801.45187: Set connection var ansible_shell_executable to /bin/sh 30564 
1726882801.45197: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882801.45202: Set connection var ansible_connection to ssh 30564 1726882801.45227: variable 'ansible_shell_executable' from source: unknown 30564 1726882801.45233: variable 'ansible_connection' from source: unknown 30564 1726882801.45239: variable 'ansible_module_compression' from source: unknown 30564 1726882801.45245: variable 'ansible_shell_type' from source: unknown 30564 1726882801.45250: variable 'ansible_shell_executable' from source: unknown 30564 1726882801.45257: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882801.45269: variable 'ansible_pipelining' from source: unknown 30564 1726882801.45279: variable 'ansible_timeout' from source: unknown 30564 1726882801.45290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882801.45476: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 30564 1726882801.45490: variable 'omit' from source: magic vars 30564 1726882801.45501: starting attempt loop 30564 1726882801.45506: running the handler 30564 1726882801.45523: variable 'ansible_facts' from source: unknown 30564 1726882801.45542: _low_level_execute_command(): starting 30564 1726882801.45552: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882801.46317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882801.46331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882801.46346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882801.46365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882801.46412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 30564 1726882801.46425: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882801.46439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882801.46458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882801.46477: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882801.46491: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882801.46502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882801.46515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882801.46531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882801.46543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882801.46556: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882801.46576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882801.46644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882801.46673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882801.46692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882801.46833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882801.48503: stdout chunk (state=3): >>>/root <<< 30564 1726882801.48682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882801.48688: stdout chunk (state=3): >>><<< 30564 1726882801.48698: stderr chunk (state=3): >>><<< 30564 1726882801.48722: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882801.48738: _low_level_execute_command(): starting 30564 1726882801.48745: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852 `" && echo ansible-tmp-1726882801.4872231-30606-215916981065852="` echo /root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852 `" ) && sleep 0' 30564 1726882801.50237: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882801.50243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882801.50450: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882801.50454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882801.50457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882801.50641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882801.50644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882801.50646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882801.50762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882801.52664: stdout chunk (state=3): >>>ansible-tmp-1726882801.4872231-30606-215916981065852=/root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852 <<< 30564 1726882801.52776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882801.52867: stderr chunk (state=3): >>><<< 30564 1726882801.52870: stdout chunk (state=3): >>><<< 30564 1726882801.53072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882801.4872231-30606-215916981065852=/root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882801.53077: variable 'ansible_module_compression' from source: unknown 30564 1726882801.53080: ANSIBALLZ: Using generic lock for ansible.legacy.setup 30564 1726882801.53082: ANSIBALLZ: Acquiring lock 30564 1726882801.53084: ANSIBALLZ: Lock acquired: 140506263950048 30564 1726882801.53086: ANSIBALLZ: Creating module 30564 1726882802.09055: ANSIBALLZ: Writing module into payload 30564 1726882802.09435: ANSIBALLZ: Writing module 30564 1726882802.09531: ANSIBALLZ: Renaming module 30564 1726882802.09541: ANSIBALLZ: Done creating module 30564 1726882802.09679: variable 'ansible_facts' from source: unknown 30564 1726882802.09690: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882802.09704: _low_level_execute_command(): starting 30564 1726882802.09721: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v 
'"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 30564 1726882802.10846: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882802.10850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882802.10891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.10895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882802.10898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882802.10900: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.10974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882802.10978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882802.10980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882802.11100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882802.12777: stdout chunk (state=3): >>>PLATFORM <<< 30564 1726882802.12861: stdout chunk (state=3): >>>Linux <<< 30564 1726882802.12878: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 
ENDFOUND <<< 30564 1726882802.13020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882802.13096: stderr chunk (state=3): >>><<< 30564 1726882802.13099: stdout chunk (state=3): >>><<< 30564 1726882802.13242: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882802.13248 [managed_node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 30564 1726882802.13251: _low_level_execute_command(): starting 30564 1726882802.13253: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 30564 1726882802.13489: Sending initial data 30564 1726882802.13492: Sent initial data (1181 bytes) 30564 1726882802.14034: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
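[editor's note] The probe above brackets its results with `PLATFORM`, `FOUND`, and `ENDFOUND` markers, which the controller then turns into the `found interpreters` list logged a moment later. A minimal sketch of that parsing, assuming only the marker layout visible in this transcript (the function name is hypothetical, not Ansible's internal API):

```python
# Parse the PLATFORM/FOUND/ENDFOUND markers emitted by the discovery probe.
# parse_discovery_output is a hypothetical helper, not Ansible's own code.
def parse_discovery_output(stdout: str):
    lines = stdout.splitlines()
    platform = lines[lines.index("PLATFORM") + 1]   # e.g. "Linux"
    start = lines.index("FOUND") + 1                # interpreters begin here
    end = lines.index("ENDFOUND")                   # and stop at the sentinel
    return platform, lines[start:end]

# The exact stdout seen in the log above:
out = "PLATFORM\nLinux\nFOUND\n/usr/bin/python3.9\n/usr/bin/python3\n/usr/bin/python3\nENDFOUND\n"
print(parse_discovery_output(out))
```

Note the duplicate `/usr/bin/python3` entries: the probe checks both the literal path and the bare name, and both resolve to the same binary on this host.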
Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882802.14042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882802.14079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882802.14082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882802.14084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882802.14086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.14147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882802.14162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882802.14293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882802.18054: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 30564 1726882802.18495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882802.18777: stderr chunk (state=3): >>><<< 30564 1726882802.18780: stdout chunk (state=3): >>><<< 30564 1726882802.18783: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882802.18785: 
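[editor's note] The stdout chunk above is a JSON blob whose `osrelease_content` field carries the remote `/etc/os-release` verbatim; the controller decodes it to identify the platform (CentOS Stream 9 here). A minimal sketch of that decoding, under the assumption that `os-release` is simple `KEY="value"` lines (`parse_os_release` is a hypothetical helper, not Ansible's):

```python
import json

# Turn os-release KEY="value" lines into a dict.
def parse_os_release(content: str) -> dict:
    info = {}
    for line in content.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            info[key] = value.strip('"')   # drop the surrounding quotes
    return info

# A trimmed version of the JSON payload seen in the log above:
raw = '{"platform_dist_result": [], "osrelease_content": "ID=\\"centos\\"\\nVERSION_ID=\\"9\\"\\nPLATFORM_ID=\\"platform:el9\\"\\n"}'
payload = json.loads(raw)
release = parse_os_release(payload["osrelease_content"])
print(release["ID"], release["VERSION_ID"])
```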
variable 'ansible_facts' from source: unknown 30564 1726882802.18787: variable 'ansible_facts' from source: unknown 30564 1726882802.18789: variable 'ansible_module_compression' from source: unknown 30564 1726882802.18791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30564 1726882802.18793: variable 'ansible_facts' from source: unknown 30564 1726882802.18896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852/AnsiballZ_setup.py 30564 1726882802.19067: Sending initial data 30564 1726882802.19070: Sent initial data (154 bytes) 30564 1726882802.20149: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882802.20165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882802.20184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882802.20214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882802.20257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882802.20274: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882802.20290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.20320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882802.20332: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882802.20342: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882802.20353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882802.20367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882802.20384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882802.20395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882802.20415: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882802.20433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.20515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882802.20549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882802.20568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882802.20701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882802.22472: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882802.22566: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882802.22662: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpzuxpr7y9 /root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852/AnsiballZ_setup.py <<< 30564 1726882802.22758: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or 
directory <<< 30564 1726882802.26071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882802.26103: stderr chunk (state=3): >>><<< 30564 1726882802.26112: stdout chunk (state=3): >>><<< 30564 1726882802.26151: done transferring module to remote 30564 1726882802.26163: _low_level_execute_command(): starting 30564 1726882802.26171: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852/ /root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852/AnsiballZ_setup.py && sleep 0' 30564 1726882802.26629: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882802.26633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882802.26644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882802.26652: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882802.26660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.26692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882802.26695: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882802.26698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882802.26700: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.26746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882802.26752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882802.26867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882802.29071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882802.29122: stderr chunk (state=3): >>><<< 30564 1726882802.29125: stdout chunk (state=3): >>><<< 30564 1726882802.29147: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882802.29151: _low_level_execute_command(): starting 30564 1726882802.29154: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852/AnsiballZ_setup.py && sleep 0' 30564 1726882802.29596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882802.29600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882802.29637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.29640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882802.29642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882802.29690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882802.29702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882802.29824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882802.32460: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 30564 1726882802.32466: stdout chunk (state=3): >>>import _imp # builtin <<< 30564 1726882802.32504: stdout chunk (state=3): >>>import '_thread' # <<< 30564 1726882802.32526: stdout chunk (state=3): >>>import '_warnings' # <<< 30564 1726882802.32531: stdout chunk 
(state=3): >>>import '_weakref' # <<< 30564 1726882802.32623: stdout chunk (state=3): >>>import '_io' # <<< 30564 1726882802.32645: stdout chunk (state=3): >>>import 'marshal' # <<< 30564 1726882802.32701: stdout chunk (state=3): >>>import 'posix' # <<< 30564 1726882802.32750: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 30564 1726882802.32767: stdout chunk (state=3): >>># installing zipimport hook <<< 30564 1726882802.32818: stdout chunk (state=3): >>>import 'time' # <<< 30564 1726882802.32850: stdout chunk (state=3): >>>import 'zipimport' # <<< 30564 1726882802.32859: stdout chunk (state=3): >>># installed zipimport hook <<< 30564 1726882802.32934: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py<<< 30564 1726882802.32944: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882802.32981: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 30564 1726882802.33015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc'<<< 30564 1726882802.33022: stdout chunk (state=3): >>> <<< 30564 1726882802.33039: stdout chunk (state=3): >>>import '_codecs' # <<< 30564 1726882802.33081: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbed8dc0> <<< 30564 1726882802.33128: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 30564 1726882802.33160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 30564 1726882802.33186: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d3a0> 
<<< 30564 1726882802.33195: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbed8b20> <<< 30564 1726882802.33231: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 30564 1726882802.33245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 30564 1726882802.33283: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbed8ac0> <<< 30564 1726882802.33317: stdout chunk (state=3): >>>import '_signal' # <<< 30564 1726882802.33321: stdout chunk (state=3): >>> <<< 30564 1726882802.33351: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 30564 1726882802.33370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 30564 1726882802.33397: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d490> <<< 30564 1726882802.33433: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 30564 1726882802.33449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 30564 1726882802.33489: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 30564 1726882802.33498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 30564 1726882802.33527: stdout chunk (state=3): >>>import '_abc' # <<< 30564 1726882802.33548: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d940> <<< 30564 1726882802.33589: stdout 
chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d670> <<< 30564 1726882802.33648: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 30564 1726882802.33677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 30564 1726882802.33708: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 30564 1726882802.33746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc'<<< 30564 1726882802.33756: stdout chunk (state=3): >>> <<< 30564 1726882802.33783: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 30564 1726882802.33815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 30564 1726882802.33848: stdout chunk (state=3): >>>import '_stat' # <<< 30564 1726882802.33871: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe34190> <<< 30564 1726882802.33898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 30564 1726882802.33938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 30564 1726882802.34055: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe34220> <<< 30564 1726882802.34090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 30564 1726882802.34094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 30564 1726882802.34132: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 30564 1726882802.34137: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe57850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe34940> <<< 30564 1726882802.34192: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe95880> <<< 30564 1726882802.34221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe2dd90> <<< 30564 1726882802.34278: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 30564 1726882802.34301: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe57d90> <<< 30564 1726882802.34377: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d970> <<< 30564 1726882802.34405: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
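[editor's note] The stream of `import ...` and `# code object from ...` lines interleaved above is the interpreter's own module-load trace: the wrapper launched `AnsiballZ_setup.py` with `PYTHONVERBOSE=1`, so CPython reports every import as it happens. The `-v` command-line flag is the documented equivalent of that environment variable, so the same trace can be reproduced locally; a sketch using the current interpreter:

```python
import subprocess
import sys

# -v is the CLI equivalent of PYTHONVERBOSE=1; the trace goes to stderr.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "pass"],
    capture_output=True,
    text=True,
)
# stderr now contains lines like "import _frozen_importlib # frozen",
# matching the shape of the trace in the log above.
print("import" in proc.stderr)
```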
<<< 30564 1726882802.34901: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 30564 1726882802.34927: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 30564 1726882802.34966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 30564 1726882802.34990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 30564 1726882802.35013: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbaeeb0> <<< 30564 1726882802.35065: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbb1f40> <<< 30564 1726882802.35096: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 30564 1726882802.35128: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 30564 1726882802.35156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 30564 1726882802.35196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faafbba7610> <<< 30564 1726882802.35223: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbae370> <<< 30564 1726882802.35240: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 30564 1726882802.35296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 30564 1726882802.35320: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 30564 1726882802.35366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882802.35397: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 30564 1726882802.35437: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafba94d90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94880> import 'itertools' # <<< 30564 1726882802.35472: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94e80> # 
/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 30564 1726882802.35495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 30564 1726882802.35539: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94f40> <<< 30564 1726882802.35550: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94e50> import '_collections' # <<< 30564 1726882802.35598: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbb89d00> <<< 30564 1726882802.35622: stdout chunk (state=3): >>>import '_functools' # <<< 30564 1726882802.35637: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbb825e0> <<< 30564 1726882802.35702: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbb96640> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbb5e20> <<< 30564 1726882802.35715: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 30564 1726882802.35760: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from 
'/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafbaa6c40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbb89220> <<< 30564 1726882802.35809: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafbb96250> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbbb9d0> <<< 30564 1726882802.35832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 30564 1726882802.35868: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882802.35926: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6f70> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6d60> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6cd0> <<< 30564 1726882802.35952: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 30564 1726882802.35967: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 30564 1726882802.35999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 30564 1726882802.36046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 30564 1726882802.36082: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc'<<< 30564 1726882802.36111: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba7a340> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 30564 1726882802.36136: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba7a430> <<< 30564 1726882802.36309: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaaef70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa8a00> <<< 30564 1726882802.36335: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa84c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from 
'/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 30564 1726882802.36372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 30564 1726882802.36474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9ad190> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba64cd0> <<< 30564 1726882802.36519: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa8e80> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbbb040> <<< 30564 1726882802.36586: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9bfac0> import 'errno' # <<< 30564 1726882802.36706: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb9bfdf0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 30564 
1726882802.36812: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9d1700> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 30564 1726882802.36838: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9d1c40> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb95f370> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9bfee0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 30564 1726882802.36923: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb96f250> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9d1580> <<< 30564 1726882802.36934: stdout chunk (state=3): >>>import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb96f310> <<< 30564 1726882802.36957: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa69a0> <<< 30564 1726882802.36994: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 30564 1726882802.37047: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 30564 1726882802.37101: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb98b670> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 30564 1726882802.37203: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb98b940> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb98b730> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7faafb98b820> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 30564 1726882802.37388: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb98bc70> <<< 30564 1726882802.37790: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb9991c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb98b8b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb97ea00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6580> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb98ba60> <<< 30564 1726882802.37929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7faafb8b9640> <<< 30564 1726882802.38304: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 30564 1726882802.38365: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30564 1726882802.38403: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 30564 1726882802.38406: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.38432: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.38437: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 30564 1726882802.38459: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.40294: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.41517: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7790> <<< 30564 1726882802.41520: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 30564 1726882802.41601: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7f7160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7ee0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 30564 1726882802.41612: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7fd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7d00> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7f7f40> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 30564 1726882802.41770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 30564 1726882802.41773: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 30564 1726882802.41779: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 30564 1726882802.41876: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7cc160> <<< 30564 1726882802.41901: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882802.41919: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1870a0> <<< 30564 1726882802.41924: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882802.41938: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb187280> <<< 30564 1726882802.41953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 30564 1726882802.41959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 30564 1726882802.42020: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb187c10> <<< 30564 1726882802.42027: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7dedc0> <<< 30564 1726882802.42297: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7de3d0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from 
'/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7def40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 30564 1726882802.42301: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 30564 1726882802.42325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 30564 1726882802.42331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 30564 1726882802.42357: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 30564 1726882802.42364: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb82cb20> <<< 30564 1726882802.42439: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7feca0> <<< 30564 1726882802.42445: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fe370> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7abbb0> <<< 30564 1726882802.42478: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882802.42500: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7fe490> <<< 30564 1726882802.42503: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882802.42508: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fe4c0> <<< 30564 1726882802.42534: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 30564 1726882802.42540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 30564 1726882802.42556: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 30564 1726882802.42598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 30564 1726882802.42658: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882802.42662: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1e5220> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb83e1c0> <<< 30564 1726882802.42689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 30564 1726882802.42695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 30564 1726882802.42765: 
stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1f28e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb83e340> <<< 30564 1726882802.42776: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 30564 1726882802.42835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882802.42838: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 30564 1726882802.42841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 30564 1726882802.42905: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb83eca0> <<< 30564 1726882802.43035: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb1f2880> <<< 30564 1726882802.43126: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7d7160> <<< 30564 1726882802.43154: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb8029a0> <<< 30564 1726882802.43196: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb83e6d0> <<< 30564 1726882802.43216: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb837880> <<< 30564 1726882802.43234: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 30564 1726882802.43250: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 30564 1726882802.43272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 30564 1726882802.43303: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882802.43309: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1e79d0> <<< 30564 1726882802.43522: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb74dd00> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb1f1640> <<< 30564 1726882802.43545: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1e7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb1f1a30> # zipimport: zlib available <<< 30564 1726882802.43551: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.43576: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 30564 1726882802.43583: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.43648: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.43729: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30564 1726882802.43739: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 30564 1726882802.43754: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.43759: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 30564 1726882802.43792: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.43882: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.43974: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.44436: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.45579: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb776790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb77b7f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafad9f9d0> import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py<<< 30564 1726882802.45586: stdout chunk (state=3): >>> # zipimport: zlib available <<< 30564 1726882802.45877: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.46077: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7b5760> # zipimport: zlib available <<< 30564 1726882802.46723: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.46731: stdout chunk (state=3): >>> <<< 30564 1726882802.47247: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47308: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47377: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 30564 1726882802.47381: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47442: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47446: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 30564 1726882802.47449: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47584: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30564 1726882802.47596: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 30564 1726882802.47600: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47610: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47613: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 30564 1726882802.47615: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47689: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47693: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 30564 1726882802.47695: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.47885: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48076: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 30564 1726882802.48105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 30564 1726882802.48108: stdout chunk (state=3): >>>import '_ast' # <<< 30564 1726882802.48191: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fa400> <<< 30564 1726882802.48198: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48252: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48322: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py <<< 30564 1726882802.48326: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 30564 1726882802.48351: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48383: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48426: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 30564 1726882802.48429: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48467: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48502: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48596: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48654: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 30564 1726882802.48686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882802.48755: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded 
from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882802.48759: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb76da60> <<< 30564 1726882802.48874: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fdf70> <<< 30564 1726882802.48903: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 30564 1726882802.48909: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.48958: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49026: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49037: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49085: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 30564 1726882802.49088: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 30564 1726882802.49109: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 30564 1726882802.49136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 30564 1726882802.49161: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 30564 1726882802.49181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 30564 1726882802.49265: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb77e640> <<< 30564 1726882802.49300: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7c9cd0> <<< 30564 1726882802.49360: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb76d7f0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 30564 1726882802.49367: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49388: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49414: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 30564 1726882802.49501: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 30564 1726882802.49505: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49507: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 30564 1726882802.49528: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49580: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49637: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49643: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49675: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49700: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49746: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49773: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49801: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 30564 1726882802.49807: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49880: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49934: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.49957: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.50010: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 30564 1726882802.50014: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.50696: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafadc89d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 30564 1726882802.50773: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafad7eb20> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so'<<< 30564 1726882802.50779: stdout chunk (state=3): >>> # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so'<<< 30564 1726882802.50785: stdout chunk (state=3): >>> <<< 30564 1726882802.50866: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafad7ea90> <<< 30564 1726882802.50904: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7faafadb4820> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafadc8f70> <<< 30564 1726882802.51064: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafab1fe20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafada4670> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7dac70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafad6d130> <<< 30564 1726882802.51069: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 30564 1726882802.51092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 30564 1726882802.51114: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7da4f0> <<< 30564 1726882802.51139: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 30564 1726882802.51162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' 
<<< 30564 1726882802.51198: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafab87f70> <<< 30564 1726882802.51235: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafadb1d30> <<< 30564 1726882802.51258: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafada49a0> <<< 30564 1726882802.51283: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 30564 1726882802.51293: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 30564 1726882802.51305: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51308: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51319: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 30564 1726882802.51323: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51404: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51468: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 30564 1726882802.51475: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51528: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51590: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 30564 1726882802.51605: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51610: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 30564 1726882802.51632: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51661: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.51701: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 30564 1726882802.51707: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.52393: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 30564 1726882802.52887: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.52893: stdout chunk (state=3): >>> <<< 30564 1726882802.53516: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py<<< 30564 1726882802.53538: stdout chunk (state=3): >>> <<< 30564 1726882802.53541: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.53544: stdout chunk (state=3): >>> <<< 30564 1726882802.53631: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.53637: stdout chunk (state=3): >>> <<< 30564 1726882802.53717: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.53723: stdout chunk (state=3): >>> <<< 30564 1726882802.53769: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.53833: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py<<< 30564 1726882802.53837: stdout chunk (state=3): >>> <<< 30564 1726882802.53840: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py<<< 30564 1726882802.53842: stdout chunk (state=3): >>> <<< 30564 1726882802.53866: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.53873: stdout chunk (state=3): >>> <<< 30564 1726882802.53912: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.53919: stdout chunk (state=3): >>> <<< 30564 1726882802.53962: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 30564 1726882802.53988: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.53991: stdout chunk (state=3): >>> <<< 30564 1726882802.54069: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.54078: stdout chunk (state=3): >>> <<< 30564 1726882802.54146: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py<<< 30564 1726882802.54174: stdout chunk (state=3): >>> <<< 30564 1726882802.54177: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.54180: stdout chunk (state=3): >>> <<< 30564 1726882802.54217: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.54223: stdout chunk (state=3): >>> <<< 30564 1726882802.54261: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py<<< 30564 1726882802.54275: stdout chunk (state=3): >>> <<< 30564 1726882802.54287: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.54294: 
stdout chunk (state=3): >>> <<< 30564 1726882802.54333: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.54339: stdout chunk (state=3): >>> <<< 30564 1726882802.54382: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py<<< 30564 1726882802.54400: stdout chunk (state=3): >>> <<< 30564 1726882802.54404: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.54410: stdout chunk (state=3): >>> <<< 30564 1726882802.54513: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.54519: stdout chunk (state=3): >>> <<< 30564 1726882802.54646: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py<<< 30564 1726882802.54652: stdout chunk (state=3): >>> <<< 30564 1726882802.54655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 30564 1726882802.54704: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaaa1df0><<< 30564 1726882802.54710: stdout chunk (state=3): >>> <<< 30564 1726882802.54740: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py<<< 30564 1726882802.54746: stdout chunk (state=3): >>> <<< 30564 1726882802.54792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc'<<< 30564 1726882802.54798: stdout chunk (state=3): >>> <<< 30564 1726882802.55061: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaaa19d0><<< 30564 1726882802.55081: stdout chunk (state=3): >>> <<< 30564 1726882802.55085: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py<<< 30564 1726882802.55108: stdout chunk (state=3): >>> <<< 30564 1726882802.55111: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55114: stdout chunk (state=3): >>> <<< 30564 1726882802.55210: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55217: stdout chunk (state=3): >>> <<< 30564 1726882802.55296: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py<<< 30564 1726882802.55316: stdout chunk (state=3): >>> <<< 30564 1726882802.55321: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55327: stdout chunk (state=3): >>> <<< 30564 1726882802.55450: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55458: stdout chunk (state=3): >>> <<< 30564 1726882802.55582: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py<<< 30564 1726882802.55603: stdout chunk (state=3): >>> <<< 30564 1726882802.55606: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55611: stdout chunk (state=3): >>> <<< 30564 1726882802.55706: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55712: stdout chunk (state=3): >>> <<< 30564 1726882802.55819: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py<<< 30564 1726882802.55838: stdout chunk 
(state=3): >>> <<< 30564 1726882802.55841: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55847: stdout chunk (state=3): >>> <<< 30564 1726882802.55907: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.55910: stdout chunk (state=3): >>> <<< 30564 1726882802.55988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 30564 1726882802.56036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 30564 1726882802.56275: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so'<<< 30564 1726882802.56289: stdout chunk (state=3): >>> # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so'<<< 30564 1726882802.56292: stdout chunk (state=3): >>> import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafaa94b50><<< 30564 1726882802.56294: stdout chunk (state=3): >>> <<< 30564 1726882802.56720: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaade6a0> <<< 30564 1726882802.56741: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 30564 1726882802.56745: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.56751: stdout chunk (state=3): >>> <<< 30564 1726882802.56831: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.56838: stdout chunk (state=3): >>> <<< 30564 1726882802.56913: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 30564 1726882802.56946: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.56951: stdout chunk (state=3): >>> <<< 30564 1726882802.57062: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.57190: stdout chunk (state=3): >>># zipimport: zlib available<<< 30564 1726882802.57193: stdout chunk (state=3): >>> <<< 30564 1726882802.57353: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.57567: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py <<< 30564 1726882802.57591: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 30564 1726882802.57630: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.57659: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 30564 1726882802.57685: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.57715: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.57835: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 30564 
1726882802.57853: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafaa11340> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaa11640> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 30564 1726882802.57884: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30564 1726882802.57902: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 30564 1726882802.57905: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.57953: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.57993: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 30564 1726882802.58010: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.58209: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.58420: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 30564 1726882802.58423: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.58546: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.58670: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 30564 1726882802.58727: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.58771: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 30564 1726882802.58785: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.58909: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.58936: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.59116: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.59310: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 30564 1726882802.59313: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.59473: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.59638: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 30564 1726882802.59641: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.59680: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.59723: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 30564 1726882802.60439: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.61139: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 30564 1726882802.61152: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.61285: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.61425: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 30564 1726882802.61430: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.61556: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.61716: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 30564 1726882802.61719: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.61879: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.62089: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 30564 1726882802.62092: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.62115: stdout chunk (state=3): >>># 
zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 30564 1726882802.62131: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.62170: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.62231: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 30564 1726882802.62363: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.62487: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.62765: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63033: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 30564 1726882802.63046: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63088: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63143: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 30564 1726882802.63146: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63167: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 
1726882802.63206: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 30564 1726882802.63209: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63292: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63403: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 30564 1726882802.63407: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63436: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63451: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 30564 1726882802.63523: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63591: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 30564 1726882802.63610: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63668: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.63737: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 30564 1726882802.63743: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 
1726882802.64102: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64451: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 30564 1726882802.64458: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64526: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64603: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 30564 1726882802.64654: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64703: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 30564 1726882802.64707: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64733: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64813: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 30564 1726882802.64818: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64820: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.64859: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 30564 1726882802.64868: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65092: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65096: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 30564 1726882802.65099: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65108: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65111: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 30564 1726882802.65114: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65182: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65251: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 30564 1726882802.65254: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65257: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65285: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65337: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65407: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65490: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65593: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import 
ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 30564 1726882802.65629: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 30564 1726882802.65673: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.65743: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 30564 1726882802.66475: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available <<< 30564 1726882802.66748: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import 
ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 30564 1726882802.66751: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.66859: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.66972: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 30564 1726882802.67079: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882802.67666: stdout chunk (state=3): >>>import 'gc' # <<< 30564 1726882802.68137: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 30564 1726882802.68181: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 30564 1726882802.68188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 30564 1726882802.68249: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafaa15310> <<< 30564 1726882802.68253: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafab810d0> <<< 30564 1726882802.68341: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa8912e0> <<< 30564 1726882802.73416: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 30564 1726882802.73479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa891430> <<< 30564 1726882802.73483: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 30564 1726882802.73513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 30564 1726882802.73544: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa879e20> <<< 30564 1726882802.73617: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882802.73680: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa8a6940> <<< 30564 1726882802.73702: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa879c10> <<< 30564 1726882802.74121: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 30564 1726882802.74125: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 30564 1726882802.99216: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "40", "second": "02", "epoch": "1726882802", "epoch_int": "1726882802", "date": "2024-09-20", "time": "21:40:02", "iso8601_micro": "2024-09-21T01:40:02.700519Z", "iso8601": "2024-09-21T01:40:02Z", "iso8601_basic": "20240920T214002700519", "iso8601_basic_short": "20240920T214002", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "r<<< 30564 1726882802.99285: stdout chunk (state=3): >>>hgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, 
"ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3268, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 741, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234078208, "block_size": 4096, "block_total": 
65519355, "block_available": 64510273, "block_used": 1009082, "inode_total": 131071472, "inode_available": 130998688, "inode_used": 72784, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fips": false, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off 
[fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback":<<< 30564 1726882802.99295: stdout chunk (state=3): >>> "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": 
"off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.44, "5m": 0.42, "15m": 0.26}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, 
"config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30564 1726882802.99858: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases <<< 30564 1726882802.99895: stdout chunk (state=3): >>># cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse 
# cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp <<< 30564 1726882802.99951: stdout chunk (state=3): >>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy 
ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 30564 1726882803.00036: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 30564 1726882803.00176: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # 
cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing 
ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl <<< 30564 1726882803.00230: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing 
ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy 
ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 30564 1726882803.00521: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30564 1726882803.00536: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 30564 1726882803.00580: stdout chunk (state=3): >>># destroy zipimport <<< 30564 1726882803.00607: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 30564 1726882803.00628: 
stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 30564 1726882803.00647: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 30564 1726882803.00695: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 30564 1726882803.00753: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 30564 1726882803.00789: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 30564 1726882803.00823: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 <<< 30564 1726882803.00851: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 30564 1726882803.00914: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 30564 1726882803.00928: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 30564 1726882803.00985: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # 
cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 30564 1726882803.01055: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 30564 1726882803.01124: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] 
wiping sre_parse <<< 30564 1726882803.01155: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 30564 1726882803.01181: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 30564 1726882803.01202: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 30564 1726882803.01405: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 30564 1726882803.01456: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 30564 1726882803.01478: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy 
ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 30564 1726882803.01481: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 30564 1726882803.01501: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 30564 1726882803.01913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882803.01916: stdout chunk (state=3): >>><<< 30564 1726882803.01918: stderr chunk (state=3): >>><<< 30564 1726882803.02175: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbed8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d3a0> 
import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbed8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbed8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe34190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from 
'/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe34220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe57850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe34940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe95880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe2dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe57d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbe7d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbaeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbb1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbba7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafba94d90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94880> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94e80> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94f40> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba94e50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbb89d00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbb825e0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faafbb96640> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbb5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafbaa6c40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbb89220> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafbb96250> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbbb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6f70> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6d60> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6cd0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba7a340> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba7a430> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaaef70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa8a00> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa84c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9ad190> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafba64cd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa8e80> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbbbb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9bfac0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb9bfdf0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9d1700> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9d1c40> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb95f370> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9bfee0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb96f250> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb9d1580> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb96f310> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa69a0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb98b670> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb98b940> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb98b730> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb98b820> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb98bc70> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb9991c0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb98b8b0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb97ea00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafbaa6580> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb98ba60> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7faafb8b9640> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7790> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7f7160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7ee0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7fd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7d00> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7f7f40> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7f7100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7cc160> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1870a0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb187280> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb187c10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7dedc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7de3d0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7def40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb82cb20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7feca0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fe370> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7abbb0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7fe490> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fe4c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1e5220> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb83e1c0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1f28e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb83e340> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb83eca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb1f2880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7d7160> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb8029a0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb83e6d0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb837880> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1e79d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb74dd00> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb1f1640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb1e7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb1f1a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb776790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb77b7f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafad9f9d0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7b5760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fa400> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb76da60> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7fdf70> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb77e640> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7c9cd0> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faafb76d7f0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafadc89d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafad7eb20> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafad7ea90> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafadb4820> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafadc8f70> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafab1fe20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafada4670> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafb7dac70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafad6d130> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafb7da4f0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafab87f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7faafadb1d30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafada49a0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaaa1df0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaaa19d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafaa94b50> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaade6a0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafaa11340> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafaa11640> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_3p4vr2kr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faafaa15310> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafab810d0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa8912e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa891430> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa879e20> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa8a6940> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faafa879c10> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "40", "second": "02", "epoch": "1726882802", "epoch_int": "1726882802", "date": "2024-09-20", "time": "21:40:02", "iso8601_micro": "2024-09-21T01:40:02.700519Z", "iso8601": "2024-09-21T01:40:02Z", "iso8601_basic": "20240920T214002700519", "iso8601_basic_short": "20240920T214002", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3268, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 741, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234078208, "block_size": 4096, "block_total": 65519355, "block_available": 64510273, "block_used": 1009082, "inode_total": 131071472, "inode_available": 130998688, "inode_used": 72784, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fips": false, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": 
{"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.44, "5m": 0.42, "15m": 0.26}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ 
# cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] 
removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux 
# cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # 
destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] 
wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. [WARNING]: Module invocation had junk after the JSON data:
# cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # 
destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] 
wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
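(Editor's note: the interpreter-discovery warning above can be silenced by pinning the interpreter explicitly. A minimal sketch, assuming the managed node keeps Python at `/usr/bin/python3.9`; the file location is hypothetical — any inventory or host_vars source works:)

```yaml
# host_vars/managed_node2.yml (hypothetical location)
# Pin the interpreter so future Python installs cannot change
# what /usr/bin/python3.9 discovery resolves to.
ansible_python_interpreter: /usr/bin/python3.9
```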
30564 1726882803.03586: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882803.03589: _low_level_execute_command(): starting 30564 1726882803.03591: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882801.4872231-30606-215916981065852/ > /dev/null 2>&1 && sleep 0' 30564 1726882803.05234: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882803.05243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.05253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.05268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.05327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.05334: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882803.05344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.05423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882803.05433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.11.158 is address <<< 30564 1726882803.05449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882803.05452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.05461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.05753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.06058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.06062: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882803.06067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.06069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882803.06071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.06073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.06075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882803.07892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.07896: stdout chunk (state=3): >>><<< 30564 1726882803.07902: stderr chunk (state=3): >>><<< 30564 1726882803.07928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882803.07953: handler run complete 30564 1726882803.08084: variable 'ansible_facts' from source: unknown 30564 1726882803.08211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.08743: variable 'ansible_facts' from source: unknown 30564 1726882803.08834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.09007: attempt loop complete, returning result 30564 1726882803.09010: _execute() done 30564 1726882803.09013: dumping result to json 30564 1726882803.09046: done dumping result, returning 30564 1726882803.09060: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-4216-acec-00000000001b] 30564 1726882803.09071: sending task result for task 0e448fcc-3ce9-4216-acec-00000000001b 30564 1726882803.13696: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000001b 30564 1726882803.13699: WORKER PROCESS EXITING ok: [managed_node2] 30564 1726882803.13874: no more pending results, returning what we have 30564 1726882803.13877: results queue empty 30564 1726882803.13878: checking for any_errors_fatal 30564 1726882803.13879: done checking for any_errors_fatal 30564 1726882803.13879: checking for max_fail_percentage 30564 1726882803.13880: done 
checking for max_fail_percentage 30564 1726882803.13880: checking to see if all hosts have failed and the running result is not ok 30564 1726882803.13881: done checking to see if all hosts have failed 30564 1726882803.13881: getting the remaining hosts for this loop 30564 1726882803.13882: done getting the remaining hosts for this loop 30564 1726882803.13885: getting the next task for host managed_node2 30564 1726882803.13887: done getting next task for host managed_node2 30564 1726882803.13889: ^ task is: TASK: meta (flush_handlers) 30564 1726882803.13890: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882803.13892: getting variables 30564 1726882803.13893: in VariableManager get_vars() 30564 1726882803.13904: Calling all_inventory to load vars for managed_node2 30564 1726882803.13906: Calling groups_inventory to load vars for managed_node2 30564 1726882803.13908: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882803.13914: Calling all_plugins_play to load vars for managed_node2 30564 1726882803.13915: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882803.13917: Calling groups_plugins_play to load vars for managed_node2 30564 1726882803.14030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.14153: done with get_vars() 30564 1726882803.14160: done getting variables 30564 1726882803.14193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 30564 1726882803.14228: in VariableManager get_vars() 30564 1726882803.14235: Calling all_inventory to load vars for managed_node2 30564 1726882803.14237: Calling groups_inventory to load vars for managed_node2 30564 
1726882803.14239: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882803.14242: Calling all_plugins_play to load vars for managed_node2 30564 1726882803.14247: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882803.14249: Calling groups_plugins_play to load vars for managed_node2 30564 1726882803.14336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.14450: done with get_vars() 30564 1726882803.14460: done queuing things up, now waiting for results queue to drain 30564 1726882803.14462: results queue empty 30564 1726882803.14467: checking for any_errors_fatal 30564 1726882803.14470: done checking for any_errors_fatal 30564 1726882803.14470: checking for max_fail_percentage 30564 1726882803.14471: done checking for max_fail_percentage 30564 1726882803.14472: checking to see if all hosts have failed and the running result is not ok 30564 1726882803.14473: done checking to see if all hosts have failed 30564 1726882803.14473: getting the remaining hosts for this loop 30564 1726882803.14475: done getting the remaining hosts for this loop 30564 1726882803.14478: getting the next task for host managed_node2 30564 1726882803.14482: done getting next task for host managed_node2 30564 1726882803.14484: ^ task is: TASK: Include the task 'el_repo_setup.yml' 30564 1726882803.14486: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882803.14488: getting variables 30564 1726882803.14488: in VariableManager get_vars() 30564 1726882803.14496: Calling all_inventory to load vars for managed_node2 30564 1726882803.14498: Calling groups_inventory to load vars for managed_node2 30564 1726882803.14500: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882803.14503: Calling all_plugins_play to load vars for managed_node2 30564 1726882803.14504: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882803.14506: Calling groups_plugins_play to load vars for managed_node2 30564 1726882803.14656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.14843: done with get_vars() 30564 1726882803.14850: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:11 Friday 20 September 2024 21:40:03 -0400 (0:00:01.717) 0:00:01.730 ****** 30564 1726882803.14923: entering _queue_task() for managed_node2/include_tasks 30564 1726882803.14924: Creating lock for include_tasks 30564 1726882803.15401: worker is 1 (out of 1 available) 30564 1726882803.15413: exiting _queue_task() for managed_node2/include_tasks 30564 1726882803.15423: done queuing things up, now waiting for results queue to drain 30564 1726882803.15424: waiting for pending results... 
30564 1726882803.16027: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 30564 1726882803.16122: in run() - task 0e448fcc-3ce9-4216-acec-000000000006 30564 1726882803.16134: variable 'ansible_search_path' from source: unknown 30564 1726882803.16187: calling self._execute() 30564 1726882803.16289: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882803.16293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882803.16302: variable 'omit' from source: magic vars 30564 1726882803.16400: _execute() done 30564 1726882803.16403: dumping result to json 30564 1726882803.16408: done dumping result, returning 30564 1726882803.16418: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-4216-acec-000000000006] 30564 1726882803.16425: sending task result for task 0e448fcc-3ce9-4216-acec-000000000006 30564 1726882803.16528: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000006 30564 1726882803.16531: WORKER PROCESS EXITING 30564 1726882803.16570: no more pending results, returning what we have 30564 1726882803.16575: in VariableManager get_vars() 30564 1726882803.16605: Calling all_inventory to load vars for managed_node2 30564 1726882803.16608: Calling groups_inventory to load vars for managed_node2 30564 1726882803.16611: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882803.16623: Calling all_plugins_play to load vars for managed_node2 30564 1726882803.16626: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882803.16630: Calling groups_plugins_play to load vars for managed_node2 30564 1726882803.16781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.16898: done with get_vars() 30564 1726882803.16904: variable 'ansible_search_path' from source: unknown 30564 1726882803.16913: we have 
included files to process 30564 1726882803.16914: generating all_blocks data 30564 1726882803.16915: done generating all_blocks data 30564 1726882803.16915: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30564 1726882803.16916: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30564 1726882803.16918: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 30564 1726882803.17377: in VariableManager get_vars() 30564 1726882803.17389: done with get_vars() 30564 1726882803.17397: done processing included file 30564 1726882803.17398: iterating over new_blocks loaded from include file 30564 1726882803.17399: in VariableManager get_vars() 30564 1726882803.17405: done with get_vars() 30564 1726882803.17406: filtering new block on tags 30564 1726882803.17415: done filtering new block on tags 30564 1726882803.17417: in VariableManager get_vars() 30564 1726882803.17423: done with get_vars() 30564 1726882803.17424: filtering new block on tags 30564 1726882803.17433: done filtering new block on tags 30564 1726882803.17435: in VariableManager get_vars() 30564 1726882803.17440: done with get_vars() 30564 1726882803.17441: filtering new block on tags 30564 1726882803.17448: done filtering new block on tags 30564 1726882803.17449: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 30564 1726882803.17453: extending task lists for all hosts with included blocks 30564 1726882803.17487: done extending task lists 30564 1726882803.17488: done processing included files 30564 1726882803.17489: results queue empty 30564 1726882803.17489: checking for any_errors_fatal 30564 1726882803.17490: done checking for any_errors_fatal 30564 
1726882803.17491: checking for max_fail_percentage 30564 1726882803.17492: done checking for max_fail_percentage 30564 1726882803.17492: checking to see if all hosts have failed and the running result is not ok 30564 1726882803.17493: done checking to see if all hosts have failed 30564 1726882803.17494: getting the remaining hosts for this loop 30564 1726882803.17495: done getting the remaining hosts for this loop 30564 1726882803.17497: getting the next task for host managed_node2 30564 1726882803.17500: done getting next task for host managed_node2 30564 1726882803.17502: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 30564 1726882803.17503: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882803.17505: getting variables 30564 1726882803.17505: in VariableManager get_vars() 30564 1726882803.17511: Calling all_inventory to load vars for managed_node2 30564 1726882803.17513: Calling groups_inventory to load vars for managed_node2 30564 1726882803.17514: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882803.17517: Calling all_plugins_play to load vars for managed_node2 30564 1726882803.17519: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882803.17520: Calling groups_plugins_play to load vars for managed_node2 30564 1726882803.17619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.17741: done with get_vars() 30564 1726882803.17748: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:40:03 -0400 (0:00:00.028) 0:00:01.759 ****** 30564 1726882803.17794: entering _queue_task() for managed_node2/setup 30564 1726882803.17994: worker is 1 (out of 1 available) 30564 1726882803.18042: exiting _queue_task() for managed_node2/setup 30564 1726882803.18053: done queuing things up, now waiting for results queue to drain 30564 1726882803.18054: waiting for pending results... 
30564 1726882803.18230: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 30564 1726882803.18332: in run() - task 0e448fcc-3ce9-4216-acec-00000000002c 30564 1726882803.18350: variable 'ansible_search_path' from source: unknown 30564 1726882803.18357: variable 'ansible_search_path' from source: unknown 30564 1726882803.18410: calling self._execute() 30564 1726882803.18492: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882803.18509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882803.18524: variable 'omit' from source: magic vars 30564 1726882803.19078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882803.20907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882803.20949: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882803.20979: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882803.21015: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882803.21034: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882803.21093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882803.21115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882803.21133: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882803.21158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882803.21172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882803.21286: variable 'ansible_facts' from source: unknown 30564 1726882803.21332: variable 'network_test_required_facts' from source: task vars 30564 1726882803.21358: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 30564 1726882803.21361: variable 'omit' from source: magic vars 30564 1726882803.21388: variable 'omit' from source: magic vars 30564 1726882803.21409: variable 'omit' from source: magic vars 30564 1726882803.21428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882803.21452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882803.21467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882803.21483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882803.21491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882803.21511: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882803.21514: variable 'ansible_host' from source: host vars for 
'managed_node2' 30564 1726882803.21516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882803.21590: Set connection var ansible_timeout to 10 30564 1726882803.21593: Set connection var ansible_pipelining to False 30564 1726882803.21596: Set connection var ansible_shell_type to sh 30564 1726882803.21601: Set connection var ansible_shell_executable to /bin/sh 30564 1726882803.21607: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882803.21610: Set connection var ansible_connection to ssh 30564 1726882803.21627: variable 'ansible_shell_executable' from source: unknown 30564 1726882803.21630: variable 'ansible_connection' from source: unknown 30564 1726882803.21633: variable 'ansible_module_compression' from source: unknown 30564 1726882803.21635: variable 'ansible_shell_type' from source: unknown 30564 1726882803.21639: variable 'ansible_shell_executable' from source: unknown 30564 1726882803.21647: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882803.21654: variable 'ansible_pipelining' from source: unknown 30564 1726882803.21657: variable 'ansible_timeout' from source: unknown 30564 1726882803.21660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882803.21794: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882803.21808: variable 'omit' from source: magic vars 30564 1726882803.21811: starting attempt loop 30564 1726882803.21814: running the handler 30564 1726882803.21825: _low_level_execute_command(): starting 30564 1726882803.21831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882803.22721: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.22830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882803.25167: stdout chunk (state=3): >>>/root <<< 30564 1726882803.25339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.25381: stderr chunk (state=3): >>><<< 30564 1726882803.25385: stdout chunk (state=3): >>><<< 30564 1726882803.25401: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882803.25410: _low_level_execute_command(): starting 30564 1726882803.25417: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269 `" && echo ansible-tmp-1726882803.2540042-30685-240996739498269="` echo /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269 `" ) && sleep 0' 30564 1726882803.25967: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882803.25985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.26000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.26018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.26066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.26083: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882803.26099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.26118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882803.26132: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882803.26151: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882803.26167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.26181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.26201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.26213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.26253: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882803.26296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.26537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882803.26554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.26573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.26742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882803.29471: stdout chunk (state=3): >>>ansible-tmp-1726882803.2540042-30685-240996739498269=/root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269 <<< 30564 1726882803.29632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.29678: stderr chunk (state=3): >>><<< 30564 1726882803.29682: stdout chunk (state=3): >>><<< 30564 1726882803.29696: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882803.2540042-30685-240996739498269=/root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882803.29735: variable 'ansible_module_compression' from source: unknown 30564 1726882803.29773: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30564 1726882803.29942: variable 'ansible_facts' from source: unknown 30564 1726882803.30017: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269/AnsiballZ_setup.py 30564 1726882803.30384: Sending initial data 30564 1726882803.30393: Sent initial data (154 bytes) 30564 1726882803.31043: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882803.31059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.31078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.31095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.31134: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.31145: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882803.31157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.31179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882803.31190: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882803.31201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882803.31211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.31223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.31239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.31250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.31259: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882803.31275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.31349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882803.31371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.31387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.31517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882803.33984: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882803.34087: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882803.34194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpas5syw0_ /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269/AnsiballZ_setup.py <<< 30564 1726882803.34299: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882803.37169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.37358: stderr chunk (state=3): >>><<< 30564 1726882803.37361: stdout chunk (state=3): >>><<< 30564 1726882803.37371: done transferring module to remote 30564 1726882803.37374: _low_level_execute_command(): starting 30564 1726882803.37377: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269/ /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269/AnsiballZ_setup.py && sleep 0' 30564 1726882803.37918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882803.37933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.37948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.37971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.38011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.38024: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882803.38039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.38056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882803.38071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882803.38082: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882803.38094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.38108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.38123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.38135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.38147: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882803.38161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.38234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882803.38251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.38268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.38453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882803.40142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.40208: stderr chunk (state=3): >>><<< 30564 1726882803.40211: stdout chunk (state=3): >>><<< 30564 1726882803.40270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882803.40274: _low_level_execute_command(): starting 30564 1726882803.40277: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269/AnsiballZ_setup.py && sleep 0' 30564 1726882803.40992: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882803.41006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.41021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.41040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.41088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.41100: stderr chunk (state=3): 
>>>debug2: match not found <<< 30564 1726882803.41115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.41134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882803.41146: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882803.41156: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882803.41172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.41187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.41203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.41217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.41229: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882803.41243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.41317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882803.41339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.41353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.41494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882803.44172: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 30564 1726882803.44215: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 30564 1726882803.44260: stdout chunk (state=3): >>>import '_io' # <<< 30564 1726882803.44265: stdout chunk (state=3): >>>import 'marshal' # <<< 30564 1726882803.44304: stdout chunk (state=3): >>>import 
'posix' # <<< 30564 1726882803.44330: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30564 1726882803.44388: stdout chunk (state=3): >>>import 'time' # <<< 30564 1726882803.44391: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 30564 1726882803.44435: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 30564 1726882803.44442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882803.44471: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 30564 1726882803.44474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 30564 1726882803.44506: stdout chunk (state=3): >>>import '_codecs' # <<< 30564 1726882803.44512: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74204d8dc0> <<< 30564 1726882803.44544: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 30564 1726882803.44547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d3a0> <<< 30564 1726882803.44566: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74204d8b20> <<< 30564 1726882803.44591: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 30564 1726882803.44598: stdout chunk (state=3): 
>>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74204d8ac0> <<< 30564 1726882803.44639: stdout chunk (state=3): >>>import '_signal' # <<< 30564 1726882803.44668: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 30564 1726882803.44707: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d490> <<< 30564 1726882803.44733: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 30564 1726882803.44753: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d940> <<< 30564 1726882803.44756: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d670> <<< 30564 1726882803.44781: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 30564 1726882803.44796: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 30564 1726882803.44833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 30564 1726882803.44850: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 30564 1726882803.44882: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # <<< 30564 1726882803.44894: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420434190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 30564 1726882803.45058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420434220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 30564 1726882803.45120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420457850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420434940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420495880> <<< 30564 1726882803.45178: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742042dd90> <<< 30564 1726882803.45182: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 30564 1726882803.45309: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f7420457d90> <<< 30564 1726882803.45313: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 30564 1726882803.45587: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 30564 1726882803.45646: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 30564 1726882803.45690: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 30564 1726882803.45693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 30564 1726882803.45708: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201aeeb0> <<< 30564 1726882803.45760: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201b1f40> <<< 30564 1726882803.45796: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 30564 1726882803.45804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # <<< 30564 1726882803.45816: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from 
'/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 30564 1726882803.45843: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 30564 1726882803.45887: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201a7610> <<< 30564 1726882803.45909: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201ad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201ae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 30564 1726882803.45989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 30564 1726882803.46003: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 30564 1726882803.46037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882803.46050: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 30564 1726882803.46094: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7420093df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200938e0> import 'itertools' # <<< 30564 1726882803.46133: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420093ee0> <<< 30564 1726882803.46149: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 30564 1726882803.46159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 30564 1726882803.46183: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420093fa0> <<< 30564 1726882803.46228: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 30564 1726882803.46242: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420093eb0> import '_collections' # <<< 30564 1726882803.46283: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420189d60> import '_functools' # <<< 30564 1726882803.46310: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420182640> <<< 30564 1726882803.46371: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201956a0> <<< 30564 1726882803.46381: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201b5df0> <<< 30564 1726882803.46418: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 30564 1726882803.46430: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f74200a6ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420189280> <<< 30564 1726882803.46474: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882803.46498: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f74201952b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201bb9a0> <<< 30564 1726882803.46519: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 30564 1726882803.46540: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882803.46571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 30564 1726882803.46591: stdout chunk (state=3): >>>import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6dc0> <<< 30564 1726882803.46619: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 30564 1726882803.46631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 30564 1726882803.46660: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 30564 1726882803.46688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 30564 1726882803.46732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 30564 1726882803.46759: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200793a0> <<< 30564 1726882803.46789: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 30564 1726882803.46820: stdout chunk (state=3): >>>import 'contextlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7420079490> <<< 30564 1726882803.46949: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200adfd0> <<< 30564 1726882803.47019: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a8a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a8580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 30564 1726882803.47040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 30564 1726882803.47060: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 30564 1726882803.47076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 30564 1726882803.47640: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffad1f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420064b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a8ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201b5fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f741ffbfb20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ffbfe50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffd1760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffd1ca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff693d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffbff40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff7a2b0> import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f741ffd15e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff7a370> <<< 30564 1726882803.47658: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 30564 1726882803.47977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff956d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff959a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff95790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff95880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 30564 1726882803.48491: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff95cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ffa2220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff95910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff89a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a65e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff95ac0> <<< 30564 1726882803.48573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc'<<< 30564 1726882803.48576: stdout chunk (state=3): >>> <<< 30564 1726882803.48620: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f741febe6a0><<< 30564 1726882803.48623: stdout chunk (state=3): >>> <<< 30564 1726882803.49015: 
stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip' <<< 30564 1726882803.49019: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.49152: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.49204: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 30564 1726882803.49222: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 30564 1726882803.49247: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.50886: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.51777: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb7f0> <<< 30564 1726882803.51781: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882803.51814: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 30564 1726882803.51853: stdout chunk (state=3): >>># extension module '_json' loaded from 
'/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fdfb160> <<< 30564 1726882803.51882: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb280> <<< 30564 1726882803.51950: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfbf40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 30564 1726882803.51983: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb4f0> <<< 30564 1726882803.51997: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfbd60> import 'atexit' # <<< 30564 1726882803.52150: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fdfbfa0> <<< 30564 1726882803.52444: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdd2100> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f788100> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7882e0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f788c70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fde1dc0> <<< 30564 1726882803.52603: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fde13a0> <<< 30564 1726882803.52626: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 30564 1726882803.52673: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f741fde1fa0> <<< 30564 1726882803.52703: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 30564 1726882803.52730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 30564 1726882803.52796: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 30564 1726882803.52800: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe32c70> <<< 30564 1726882803.52889: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdddd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fddd3d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdb0b50> <<< 30564 1726882803.52932: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fddd4f0> <<< 30564 1726882803.52942: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fddd520> <<< 30564 1726882803.52980: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 30564 1726882803.52983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 30564 1726882803.53012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 30564 1726882803.53085: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7e4310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe44220> <<< 30564 1726882803.53122: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 30564 1726882803.53126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 30564 1726882803.53166: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7f0880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe443a0> <<< 30564 1726882803.53193: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 30564 1726882803.53242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882803.53264: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 30564 1726882803.53317: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe44ca0> <<< 30564 1726882803.53474: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f7f0820> <<< 30564 1726882803.53583: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fddcaf0> <<< 30564 1726882803.53644: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fe44940> <<< 30564 1726882803.53688: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fe445b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe3c8e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 30564 1726882803.53734: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7e6970> <<< 30564 1726882803.53965: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fd52d60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f7ef5e0> <<< 30564 1726882803.54006: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7e6f10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f7ef9d0> <<< 30564 1726882803.54009: stdout 
chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 30564 1726882803.54029: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.54091: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.54183: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30564 1726882803.54216: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 30564 1726882803.54232: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.54324: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.54421: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.54885: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.55386: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 30564 1726882803.55390: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882803.55439: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fd517f0> <<< 30564 1726882803.55522: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 30564 1726882803.55525: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fd8c880> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a69a0> <<< 30564 1726882803.55705: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 30564 1726882803.55783: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available <<< 30564 1726882803.55961: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdb8730> # zipimport: zlib available <<< 30564 1726882803.56301: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.56673: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.56724: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.56796: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 30564 1726882803.56831: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.56878: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 30564 1726882803.56921: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57018: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 30564 1726882803.57021: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57026: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 30564 1726882803.57038: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57067: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57110: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 30564 1726882803.57113: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57298: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57521: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 30564 1726882803.57524: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 30564 1726882803.57600: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfe3a0> <<< 30564 1726882803.57603: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57662: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.57745: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 30564 1726882803.57752: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 30564 1726882803.57893: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available <<< 30564 1726882803.57978: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58250: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fd70610> <<< 30564 1726882803.58264: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f237b50> <<< 30564 1726882803.58310: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 30564 1726882803.58313: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58376: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58423: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58439: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58490: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 30564 1726882803.58497: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 30564 1726882803.58513: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 30564 1726882803.58542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 30564 1726882803.58569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 30564 1726882803.58590: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 30564 1726882803.58671: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fd836a0> <<< 30564 1726882803.58706: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdcfe50> <<< 30564 1726882803.58774: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfe850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 30564 1726882803.58780: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58800: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58823: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py <<< 30564 1726882803.58826: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 30564 1726882803.58926: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available <<< 30564 1726882803.58933: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 30564 1726882803.58944: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.58994: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59050: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 
1726882803.59062: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59085: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59121: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59156: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59189: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59229: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 30564 1726882803.59231: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59292: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59368: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59380: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59414: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 30564 1726882803.59417: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59561: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59701: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59736: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.59784: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882803.59803: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 30564 1726882803.59817: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 30564 1726882803.59840: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 30564 1726882803.59843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 30564 1726882803.59871: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a46d0> <<< 30564 1726882803.59898: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 30564 1726882803.59925: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 30564 1726882803.59950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 30564 1726882803.59988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 30564 1726882803.59991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f388a30> <<< 30564 1726882803.60033: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882803.60036: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f741f3889a0> <<< 30564 1726882803.60095: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3b8040> <<< 30564 1726882803.60115: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a4520> <<< 30564 1726882803.60133: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f121fa0> <<< 30564 1726882803.60153: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f121be0> <<< 30564 1726882803.60156: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 30564 1726882803.60182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 30564 1726882803.60203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 30564 1726882803.60248: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882803.60252: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fdded00> <<< 30564 1726882803.60257: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f368e80> <<< 30564 1726882803.60280: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 30564 1726882803.60282: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 30564 1726882803.60311: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdde0d0> <<< 30564 1726882803.60326: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 30564 1726882803.60343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 30564 1726882803.60371: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882803.60389: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f18afd0> <<< 30564 1726882803.60404: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3b6e50> <<< 30564 1726882803.60433: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f121e50> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 30564 1726882803.60452: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available <<< 30564 1726882803.60482: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 30564 1726882803.60485: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60535: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60593: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 30564 1726882803.60595: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60641: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60711: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 30564 1726882803.60714: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60716: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60718: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 30564 1726882803.60724: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60743: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60783: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 30564 1726882803.60786: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60823: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60869: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 30564 1726882803.60906: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.60944: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 30564 1726882803.60947: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.61000: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.61055: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.61099: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.61147: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 30564 1726882803.61159: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.61553: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.61921: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 30564 1726882803.61966: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62009: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62048: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62077: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 30564 1726882803.62104: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62146: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 30564 1726882803.62150: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62186: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62242: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 30564 1726882803.62277: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62301: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 30564 1726882803.62323: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62368: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 30564 1726882803.62423: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62512: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from 
'/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 30564 1726882803.62543: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f0a0e50> <<< 30564 1726882803.62555: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 30564 1726882803.62575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 30564 1726882803.62727: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f0a09d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 30564 1726882803.62791: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.62845: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 30564 1726882803.62918: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.63002: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 30564 1726882803.63091: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.63140: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 30564 1726882803.63143: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.63218: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 30564 1726882803.63221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 30564 1726882803.63243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 30564 1726882803.63392: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f098790> <<< 30564 1726882803.64093: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a97f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 30564 1726882803.64374: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f05e310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f05e340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 30564 1726882803.64419: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 30564 1726882803.64550: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.64678: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 30564 1726882803.64690: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.64762: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.64967: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 30564 1726882803.65033: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.65045: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.65161: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.65286: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 30564 1726882803.65392: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.65501: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 30564 1726882803.65504: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.65523: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.65555: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.65994: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.66416: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 30564 1726882803.66419: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.66500: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.66591: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 30564 1726882803.66679: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.66771: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 30564 1726882803.66775: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.66890: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67038: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 30564 1726882803.67043: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67047: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 30564 1726882803.67050: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67088: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67135: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 30564 1726882803.67138: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67220: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67300: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67472: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67646: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 30564 1726882803.67653: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67679: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67708: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 30564 1726882803.67734: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67737: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67762: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 30564 1726882803.67826: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67900: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 30564 1726882803.67904: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.67927: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 30564 1726882803.67942: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 30564 1726882803.67989: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68280: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 30564 1726882803.68364: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68590: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 30564 1726882803.68594: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68624: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68686: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 30564 1726882803.68719: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68754: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 30564 1726882803.68758: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68781: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 30564 1726882803.68811: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 30564 1726882803.68845: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68875: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 30564 1726882803.68888: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.68943: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69025: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 30564 1726882803.69058: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 30564 1726882803.69061: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69088: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69146: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 30564 1726882803.69150: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69174: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 30564 1726882803.69214: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69252: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 30564 1726882803.69316: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69390: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 30564 1726882803.69403: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 30564 1726882803.69432: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69485: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 30564 1726882803.69646: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69814: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 30564 1726882803.69817: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69846: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69893: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 30564 1726882803.69936: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.69985: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 30564 1726882803.69987: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.70044: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.70133: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 30564 1726882803.70136: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.70199: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.70288: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 30564 1726882803.70353: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882803.70516: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 30564 1726882803.70558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from 
'/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 30564 1726882803.70610: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f081e50> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f010520> <<< 30564 1726882803.70660: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f010d90> <<< 30564 1726882803.72026: stdout chunk (state=3): >>>import 'gc' # <<< 30564 1726882803.72029: stdout chunk (state=3): >>> <<< 30564 1726882803.73488: stdout chunk (state=3): >>> <<< 30564 1726882803.73521: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64",<<< 30564 1726882803.73532: stdout chunk (state=3): >>> "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "F<<< 30564 1726882803.73568: stdout chunk (state=3): >>>riday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "40", "second": "03", "epoch": "1726882803", "epoch_int": "1726882803", "date": "2024-09-20", "time": "21:40:03", "iso8601_micro": "2024-09-21T01:40:03.731320Z", "iso8601": "2024-09-21T01:40:03Z", "iso8601_basic": "20240920T214003731320", "iso8601_basic_short": "20240920T214003", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30564 1726882803.74260: stdout chunk (state=3): >>># clear builtins._ # clear sys.path <<< 30564 1726882803.74300: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 30564 1726882803.74314: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ <<< 30564 1726882803.74322: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] 
removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 30564 1726882803.74328: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 30564 1726882803.74335: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc <<< 30564 1726882803.74341: stdout chunk (state=3): >>># cleanup[2] removing contextlib # cleanup[2] removing 
typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 30564 1726882803.74354: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess <<< 30564 1726882803.74387: stdout chunk (state=3): >>># cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] 
removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing 
ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 30564 1726882803.74400: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing 
multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai <<< 30564 1726882803.74430: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing 
ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin <<< 30564 1726882803.74434: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd <<< 30564 1726882803.74442: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # 
cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector <<< 30564 1726882803.74445: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time <<< 30564 1726882803.74454: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy 
ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd <<< 30564 1726882803.74482: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # 
destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 30564 1726882803.74902: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport <<< 30564 1726882803.75005: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 30564 1726882803.75739: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # 
destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # 
cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 30564 1726882803.75743: stdout chunk (state=3): >>># destroy select # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 30564 1726882803.75757: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 30564 1726882803.75796: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 30564 1726882803.76269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882803.76272: stdout chunk (state=3): >>><<< 30564 1726882803.76293: stderr chunk (state=3): >>><<< 30564 1726882803.76435: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74204d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f742047d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74204d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74204d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420434190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches 
/usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420434220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420457850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420434940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420495880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742042dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420457d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f742047d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201aeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201b1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201a7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201ad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201ae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7420093df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200938e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420093ee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420093fa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420093eb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420189d60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420182640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f74201956a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201b5df0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f74200a6ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420189280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f74201952b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201bb9a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6dc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200793a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420079490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200adfd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a8a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a8580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffad1f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7420064b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a8ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74201b5fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffbfb20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ffbfe50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffd1760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffd1ca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff693d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffbff40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff7a2b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ffd15e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff7a370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a6a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff956d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff959a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff95790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff95880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ff95cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741ffa2220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff95910> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff89a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f74200a65e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741ff95ac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f741febe6a0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb7f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fdfb160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfbf40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfbd60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fdfbfa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfb100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f741fdd2100> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f788100> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7882e0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f788c70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fde1dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fde13a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fde1fa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe32c70> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdddd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fddd3d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdb0b50> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fddd4f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fddd520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7e4310> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe44220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7f0880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe443a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe44ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f7f0820> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fddcaf0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fe44940> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fe445b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fe3c8e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7e6970> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fd52d60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f7ef5e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f7e6f10> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f741f7ef9d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fd517f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fd8c880> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a69a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdb8730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfe3a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fd70610> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f237b50> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fd836a0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdcfe50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdfe850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded 
from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a46d0> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f388a30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f3889a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3b8040> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a4520> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f121fa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f121be0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741fdded00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f368e80> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741fdde0d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f18afd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3b6e50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f121e50> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f0a0e50> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f0a09d0> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f098790> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f3a97f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f05e310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f05e340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip 
/tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_5z53mahs/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f741f081e50> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f741f010520> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f741f010d90> import 'gc' # {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "40", "second": "03", "epoch": "1726882803", "epoch_int": "1726882803", "date": "2024-09-20", "time": "21:40:03", "iso8601_micro": "2024-09-21T01:40:03.731320Z", "iso8601": "2024-09-21T01:40:03Z", "iso8601_basic": "20240920T214003731320", "iso8601_basic_short": "20240920T214003", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "gather_subset": ["min"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing 
operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] 
removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] 
removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing 
ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy 
ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy 
ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # 
destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg 
# cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
[WARNING]: Module invocation had junk after the JSON data
30564 1726882803.77620: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882803.77623: _low_level_execute_command(): starting 30564 1726882803.77626: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882803.2540042-30685-240996739498269/ > /dev/null 2>&1 && sleep 0' 30564 1726882803.78687: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882803.78703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.78716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.78732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.78777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
30564 1726882803.78789: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882803.78804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.78820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882803.78830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882803.78840: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882803.78850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.78861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.78881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.78892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.78901: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882803.78914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.78992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882803.79012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.79028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.79161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882803.81476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.81480: stdout chunk (state=3): >>><<< 30564 1726882803.81496: stderr chunk (state=3): >>><<< 30564 1726882803.81675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882803.81679: handler run complete 30564 1726882803.81682: variable 'ansible_facts' from source: unknown 30564 1726882803.81684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.81752: variable 'ansible_facts' from source: unknown 30564 1726882803.81834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.81990: attempt loop complete, returning result 30564 1726882803.82007: _execute() done 30564 1726882803.82014: dumping result to json 30564 1726882803.82031: done dumping result, returning 30564 1726882803.82044: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-4216-acec-00000000002c] 30564 1726882803.82081: sending task result for task 0e448fcc-3ce9-4216-acec-00000000002c ok: [managed_node2] 
30564 1726882803.82406: no more pending results, returning what we have 30564 1726882803.82409: results queue empty 30564 1726882803.82410: checking for any_errors_fatal 30564 1726882803.82412: done checking for any_errors_fatal 30564 1726882803.82413: checking for max_fail_percentage 30564 1726882803.82415: done checking for max_fail_percentage 30564 1726882803.82415: checking to see if all hosts have failed and the running result is not ok 30564 1726882803.82416: done checking to see if all hosts have failed 30564 1726882803.82417: getting the remaining hosts for this loop 30564 1726882803.82419: done getting the remaining hosts for this loop 30564 1726882803.82423: getting the next task for host managed_node2 30564 1726882803.82434: done getting next task for host managed_node2 30564 1726882803.82436: ^ task is: TASK: Check if system is ostree 30564 1726882803.82439: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882803.82443: getting variables 30564 1726882803.82444: in VariableManager get_vars() 30564 1726882803.82475: Calling all_inventory to load vars for managed_node2 30564 1726882803.82478: Calling groups_inventory to load vars for managed_node2 30564 1726882803.82481: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882803.82493: Calling all_plugins_play to load vars for managed_node2 30564 1726882803.82496: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882803.82499: Calling groups_plugins_play to load vars for managed_node2 30564 1726882803.82711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882803.83130: done with get_vars() 30564 1726882803.83141: done getting variables 30564 1726882803.83173: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000002c 30564 1726882803.83176: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:40:03 -0400 (0:00:00.654) 0:00:02.413 ****** 30564 1726882803.83248: entering _queue_task() for managed_node2/stat 30564 1726882803.83498: worker is 1 (out of 1 available) 30564 1726882803.83509: exiting _queue_task() for managed_node2/stat 30564 1726882803.83519: done queuing things up, now waiting for results queue to drain 30564 1726882803.83520: waiting for pending results... 
30564 1726882803.83672: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 30564 1726882803.83736: in run() - task 0e448fcc-3ce9-4216-acec-00000000002e 30564 1726882803.83746: variable 'ansible_search_path' from source: unknown 30564 1726882803.83750: variable 'ansible_search_path' from source: unknown 30564 1726882803.83784: calling self._execute() 30564 1726882803.83835: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882803.83838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882803.83848: variable 'omit' from source: magic vars 30564 1726882803.84180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882803.84352: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882803.84386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882803.84411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882803.84455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882803.84521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882803.84540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882803.84560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882803.84581: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882803.84668: Evaluated conditional (not __network_is_ostree is defined): True 30564 1726882803.84676: variable 'omit' from source: magic vars 30564 1726882803.84701: variable 'omit' from source: magic vars 30564 1726882803.84724: variable 'omit' from source: magic vars 30564 1726882803.84746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882803.84765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882803.84782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882803.84795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882803.84802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882803.84823: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882803.84826: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882803.84828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882803.84899: Set connection var ansible_timeout to 10 30564 1726882803.84902: Set connection var ansible_pipelining to False 30564 1726882803.84904: Set connection var ansible_shell_type to sh 30564 1726882803.84910: Set connection var ansible_shell_executable to /bin/sh 30564 1726882803.84916: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882803.84919: Set connection var ansible_connection to ssh 30564 1726882803.84936: variable 'ansible_shell_executable' from source: unknown 30564 1726882803.84939: variable 'ansible_connection' from 
source: unknown 30564 1726882803.84943: variable 'ansible_module_compression' from source: unknown 30564 1726882803.84947: variable 'ansible_shell_type' from source: unknown 30564 1726882803.84961: variable 'ansible_shell_executable' from source: unknown 30564 1726882803.84963: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882803.84978: variable 'ansible_pipelining' from source: unknown 30564 1726882803.84980: variable 'ansible_timeout' from source: unknown 30564 1726882803.84983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882803.85219: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882803.85234: variable 'omit' from source: magic vars 30564 1726882803.85245: starting attempt loop 30564 1726882803.85263: running the handler 30564 1726882803.85283: _low_level_execute_command(): starting 30564 1726882803.85295: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882803.86124: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.86128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.86191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882803.86195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.86198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.86233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.86249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.86400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882803.88669: stdout chunk (state=3): >>>/root <<< 30564 1726882803.88903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.88906: stdout chunk (state=3): >>><<< 30564 1726882803.88909: stderr chunk (state=3): >>><<< 30564 1726882803.89475: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882803.89485: _low_level_execute_command(): starting 30564 1726882803.89488: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516 `" && echo ansible-tmp-1726882803.8941927-30731-21224656307516="` echo /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516 `" ) && sleep 0' 30564 1726882803.90331: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882803.90339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.90349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882803.90362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.90403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.90412: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882803.90418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.90430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882803.90438: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882803.90444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882803.90452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882803.90461: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882803.90481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882803.90487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882803.90494: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882803.90504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882803.90577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882803.90591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882803.90602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882803.90738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882803.93392: stdout chunk (state=3): >>>ansible-tmp-1726882803.8941927-30731-21224656307516=/root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516 <<< 30564 1726882803.93582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882803.93669: stderr chunk (state=3): >>><<< 30564 1726882803.93673: stdout chunk (state=3): >>><<< 30564 1726882803.93773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882803.8941927-30731-21224656307516=/root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882803.93778: variable 'ansible_module_compression' from source: unknown 30564 1726882803.93882: ANSIBALLZ: Using lock for stat 30564 1726882803.93885: ANSIBALLZ: Acquiring lock 30564 1726882803.93888: ANSIBALLZ: Lock acquired: 140506262892192 30564 1726882803.93970: ANSIBALLZ: Creating module 30564 1726882804.05331: ANSIBALLZ: Writing module into payload 30564 1726882804.05425: ANSIBALLZ: Writing module 30564 1726882804.05439: ANSIBALLZ: Renaming module 30564 1726882804.05444: ANSIBALLZ: Done creating module 30564 1726882804.05458: variable 'ansible_facts' from source: unknown 30564 1726882804.05523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516/AnsiballZ_stat.py 30564 1726882804.06053: Sending initial data 30564 1726882804.06062: Sent initial data (152 bytes) 30564 1726882804.06597: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882804.06605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882804.06648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882804.06651: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882804.06653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882804.06655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882804.06657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882804.06717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882804.06720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882804.06727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882804.06852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882804.09384: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882804.09479: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 
1726882804.09578: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp6lvkv2ap /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516/AnsiballZ_stat.py <<< 30564 1726882804.09687: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882804.10886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882804.10982: stderr chunk (state=3): >>><<< 30564 1726882804.10985: stdout chunk (state=3): >>><<< 30564 1726882804.10999: done transferring module to remote 30564 1726882804.11010: _low_level_execute_command(): starting 30564 1726882804.11015: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516/ /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516/AnsiballZ_stat.py && sleep 0' 30564 1726882804.11458: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882804.11462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882804.11498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882804.11501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882804.11503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882804.11559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882804.11563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882804.11679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882804.14172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882804.14203: stderr chunk (state=3): >>><<< 30564 1726882804.14214: stdout chunk (state=3): >>><<< 30564 1726882804.14314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882804.14318: _low_level_execute_command(): starting 30564 1726882804.14320: _low_level_execute_command(): 
executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516/AnsiballZ_stat.py && sleep 0' 30564 1726882804.14938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882804.14951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882804.14966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882804.14985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882804.15029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882804.15043: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882804.15057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882804.15077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882804.15091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882804.15124: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882804.15126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882804.15129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882804.15181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882804.15184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882804.15198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882804.15330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882804.18067: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 30564 1726882804.18161: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 30564 1726882804.18207: stdout chunk (state=3): >>>import 'posix' # <<< 30564 1726882804.18245: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 30564 1726882804.18310: stdout chunk (state=3): >>>import 'time' # <<< 30564 1726882804.18313: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 30564 1726882804.18416: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882804.18421: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 30564 1726882804.18435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 30564 1726882804.18468: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd73dc0> <<< 30564 1726882804.18524: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 30564 1726882804.18538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd73b20> <<< 30564 1726882804.18567: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 30564 1726882804.18597: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd73ac0> <<< 30564 1726882804.18631: stdout chunk (state=3): >>>import '_signal' # <<< 30564 1726882804.18648: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 30564 1726882804.18659: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18490> <<< 30564 1726882804.18687: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 30564 1726882804.18730: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 30564 1726882804.18742: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18940> <<< 30564 1726882804.18768: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18670> <<< 30564 1726882804.18800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 30564 1726882804.18833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 30564 1726882804.18844: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches 
/usr/lib64/python3.9/os.py <<< 30564 1726882804.18869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 30564 1726882804.18888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 30564 1726882804.18913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 30564 1726882804.18951: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cccf190> <<< 30564 1726882804.18963: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 30564 1726882804.18995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 30564 1726882804.19105: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cccf220> <<< 30564 1726882804.19139: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 30564 1726882804.19182: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981ccf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cccf940> <<< 30564 1726882804.19223: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd30880> <<< 30564 1726882804.19258: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches 
/usr/lib64/python3.9/_sitebuiltins.py <<< 30564 1726882804.19266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981ccc8d90> <<< 30564 1726882804.19335: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 30564 1726882804.19347: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981ccf2d90> <<< 30564 1726882804.19436: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18970> <<< 30564 1726882804.19480: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 30564 1726882804.19822: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 30564 1726882804.19830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 30564 1726882804.19857: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 30564 1726882804.19879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 30564 1726882804.19891: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 30564 1726882804.19918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 30564 1726882804.19957: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 30564 1726882804.19971: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc6eeb0> <<< 30564 1726882804.20068: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc71f40> <<< 30564 1726882804.20101: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 30564 1726882804.20120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 30564 1726882804.20134: stdout chunk (state=3): >>>import '_sre' # <<< 30564 1726882804.20172: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 30564 1726882804.20201: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 30564 1726882804.20238: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc67610> <<< 30564 1726882804.20267: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc6d640> <<< 30564 1726882804.20284: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc6e370> <<< 30564 1726882804.20294: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 30564 1726882804.20394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 30564 1726882804.20417: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 30564 1726882804.20466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882804.20496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 30564 1726882804.20536: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c9d4df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d48e0> <<< 30564 1726882804.20549: stdout chunk (state=3): 
>>>import 'itertools' # <<< 30564 1726882804.20576: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d4ee0> <<< 30564 1726882804.20612: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 30564 1726882804.20624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 30564 1726882804.20662: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d4fa0> <<< 30564 1726882804.20718: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d4eb0> <<< 30564 1726882804.20730: stdout chunk (state=3): >>>import '_collections' # <<< 30564 1726882804.20776: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc49d60> <<< 30564 1726882804.20779: stdout chunk (state=3): >>>import '_functools' # <<< 30564 1726882804.20813: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc42640> <<< 30564 1726882804.20888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc556a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc75df0> <<< 30564 
1726882804.20921: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 30564 1726882804.20979: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c9e7ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc49280> <<< 30564 1726882804.21025: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981cc552b0> <<< 30564 1726882804.21028: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc7b9a0> <<< 30564 1726882804.21067: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 30564 1726882804.21098: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882804.21125: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 30564 1726882804.21160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 
'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7dc0> <<< 30564 1726882804.21193: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7d30> <<< 30564 1726882804.21218: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 30564 1726882804.21254: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 30564 1726882804.21259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 30564 1726882804.21286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 30564 1726882804.21342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 30564 1726882804.21388: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9ba3a0> <<< 30564 1726882804.21416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 30564 1726882804.21458: 
stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9ba490> <<< 30564 1726882804.21637: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9eefd0> <<< 30564 1726882804.21699: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e9a60> <<< 30564 1726882804.21709: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e9580> <<< 30564 1726882804.21722: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 30564 1726882804.21772: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 30564 1726882804.21785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 30564 1726882804.21827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 30564 1726882804.21831: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8d41f0> <<< 30564 1726882804.21870: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9a5b80> <<< 30564 1726882804.21941: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e9ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc75fd0> <<< 30564 1726882804.21995: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches 
/usr/lib64/python3.9/shutil.py <<< 30564 1726882804.22013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 30564 1726882804.22042: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8e6b20> <<< 30564 1726882804.22045: stdout chunk (state=3): >>>import 'errno' # <<< 30564 1726882804.22126: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8e6e50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 30564 1726882804.22174: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 30564 1726882804.22182: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8f8760> <<< 30564 1726882804.22212: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 30564 1726882804.22226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 30564 1726882804.22259: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8f8ca0> <<< 30564 1726882804.22354: stdout chunk (state=3): >>># extension module '_bz2' 
loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8853d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8e6f40> <<< 30564 1726882804.22358: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 30564 1726882804.22411: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8962b0> <<< 30564 1726882804.22440: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8f85e0> import 'pwd' # <<< 30564 1726882804.22452: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c896370> <<< 30564 1726882804.22581: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7a00> <<< 30564 1726882804.22596: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from 
'/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 30564 1726882804.22646: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b16d0> <<< 30564 1726882804.22675: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 30564 1726882804.22704: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b19a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8b1790> <<< 30564 1726882804.22756: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b1880> <<< 30564 1726882804.22771: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 30564 1726882804.23016: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from 
'/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b1cd0> <<< 30564 1726882804.23053: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8be220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8b1910> <<< 30564 1726882804.23089: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8a5a60> <<< 30564 1726882804.23108: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e75e0> <<< 30564 1726882804.23136: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 30564 1726882804.23210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 30564 1726882804.23260: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8b1ac0> <<< 30564 1726882804.23713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 30564 1726882804.23725: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f981c7db6a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip' # zipimport: zlib available <<< 30564 1726882804.23768: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.23800: stdout chunk (state=3): >>>import ansible # loaded from Zip 
/tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/__init__.py <<< 30564 1726882804.23817: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.23830: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 30564 1726882804.23855: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.25818: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.27417: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d67f0> <<< 30564 1726882804.27449: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882804.27476: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 30564 1726882804.27500: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 30564 1726882804.27534: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882804.27537: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c1d6160> <<< 30564 1726882804.27584: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6280> <<< 30564 1726882804.27631: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6f40> <<< 30564 1726882804.27653: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 30564 1726882804.27658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 30564 1726882804.27718: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6d60> <<< 30564 1726882804.27722: stdout chunk (state=3): >>>import 'atexit' # <<< 30564 1726882804.27756: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882804.27760: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c1d6fa0> <<< 30564 1726882804.27781: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 30564 1726882804.27815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 30564 1726882804.27863: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6100> <<< 30564 1726882804.27893: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 30564 
1726882804.27910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 30564 1726882804.27930: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 30564 1726882804.27951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 30564 1726882804.27983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 30564 1726882804.28102: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c12df10> <<< 30564 1726882804.28144: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c14cf10> <<< 30564 1726882804.28187: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 30564 1726882804.28190: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c14cd30> <<< 30564 1726882804.28201: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 30564 1726882804.28241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 30564 1726882804.28305: stdout chunk (state=3): >>>import 
'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c14c3a0> <<< 30564 1726882804.28321: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c768dc0> <<< 30564 1726882804.28578: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c7683a0> <<< 30564 1726882804.28619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 30564 1726882804.28657: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c768fa0> <<< 30564 1726882804.28709: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 30564 1726882804.28718: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 30564 1726882804.28730: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 30564 1726882804.28776: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c738c70> <<< 30564 1726882804.29605: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1a9d00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f981c1a93d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1df4c0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c1a94f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1a9520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c10e310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c749220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' 
import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c11a880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c7493a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c761dc0> <<< 30564 1726882804.29712: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c11a820> <<< 30564 1726882804.30099: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c11a670> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c119610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c119520> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f981c7408e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c19f6a0> <<< 30564 1726882804.30349: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c19daf0> <<< 30564 1726882804.30362: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1ad0a0> <<< 30564 1726882804.30400: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c19f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1e2ac0> <<< 30564 1726882804.30439: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.30454: stdout chunk (state=3): >>># zipimport: zlib 
available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 30564 1726882804.30457: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.30568: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.30683: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.30716: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 30564 1726882804.30737: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 30564 1726882804.30898: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.31045: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.31847: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.32641: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 30564 1726882804.32660: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 30564 1726882804.32676: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 30564 
1726882804.32738: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981bcf05b0> <<< 30564 1726882804.32847: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 30564 1726882804.32870: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0eb550> <<< 30564 1726882804.32873: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981bc900d0> <<< 30564 1726882804.32928: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 30564 1726882804.32932: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.32955: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.32975: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 30564 1726882804.33157: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.33360: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 30564 1726882804.33395: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c19dbe0> # zipimport: zlib available 
<<< 30564 1726882804.34049: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.34730: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.34816: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.34931: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 30564 1726882804.34992: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.35047: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 30564 1726882804.35050: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.35130: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.35256: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 30564 1726882804.35280: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.35296: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 30564 1726882804.35299: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.35334: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.35389: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 30564 1726882804.35713: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 
1726882804.36013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 30564 1726882804.36063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 30564 1726882804.36066: stdout chunk (state=3): >>>import '_ast' # <<< 30564 1726882804.36180: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0f79a0> <<< 30564 1726882804.36183: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36267: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36343: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 30564 1726882804.36383: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 30564 1726882804.36397: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36432: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36488: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 30564 1726882804.36492: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36536: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36582: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36715: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.36799: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 30564 1726882804.36847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 30564 1726882804.36946: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c754250> <<< 30564 1726882804.36988: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0f7f10> <<< 30564 1726882804.37051: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 30564 1726882804.37054: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.37245: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.37353: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.37356: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.37420: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 30564 1726882804.37433: stdout chunk (state=3): >>># 
code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 30564 1726882804.37447: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 30564 1726882804.37488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 30564 1726882804.37521: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 30564 1726882804.37537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 30564 1726882804.37686: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0dd7f0> <<< 30564 1726882804.37738: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0d8820> <<< 30564 1726882804.37839: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0d2a00> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 30564 1726882804.37842: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.37874: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.37903: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 30564 1726882804.38006: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip 
/tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 30564 1726882804.38042: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 30564 1726882804.38045: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.38225: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.38482: stdout chunk (state=3): >>># zipimport: zlib available <<< 30564 1726882804.38739: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 30564 1726882804.38743: stdout chunk (state=3): >>># destroy __main__ <<< 30564 1726882804.39151: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 30564 1726882804.39221: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing 
encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing 
threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing 
systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 30564 1726882804.39229: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 30564 1726882804.39515: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 30564 1726882804.39531: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 30564 1726882804.39570: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 30564 1726882804.39613: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 30564 1726882804.39648: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 30564 1726882804.39651: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy 
argparse <<< 30564 1726882804.39734: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 30564 1726882804.39759: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 30564 1726882804.39802: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 30564 1726882804.39837: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 30564 1726882804.39858: stdout chunk (state=3): >>># cleanup[3] wiping platform <<< 30564 1726882804.39895: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal <<< 30564 1726882804.39934: stdout chunk (state=3): >>># cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib <<< 30564 1726882804.39962: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 30564 1726882804.39997: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 30564 1726882804.40044: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib <<< 30564 1726882804.40074: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 30564 1726882804.40106: stdout chunk (state=3): >>># 
cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 30564 1726882804.40157: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 30564 1726882804.40194: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections <<< 30564 1726882804.40219: stdout chunk (state=3): >>># destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 30564 1726882804.40245: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale <<< 30564 1726882804.40283: stdout chunk (state=3): >>># cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 30564 1726882804.40322: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal <<< 30564 1726882804.40362: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 30564 1726882804.40411: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 30564 1726882804.40455: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 30564 1726882804.40480: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy systemd._daemon # destroy 
_socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 30564 1726882804.40671: stdout chunk (state=3): >>># destroy platform <<< 30564 1726882804.40714: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 30564 1726882804.40738: stdout chunk (state=3): >>># destroy _heapq <<< 30564 1726882804.40762: stdout chunk (state=3): >>># destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 30564 1726882804.40809: stdout chunk (state=3): >>># destroy select <<< 30564 1726882804.40848: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves <<< 30564 1726882804.40852: stdout chunk (state=3): >>># destroy _operator <<< 30564 1726882804.40869: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 30564 1726882804.40910: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 30564 1726882804.41358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882804.41754: stderr chunk (state=3): >>><<< 30564 1726882804.41757: stdout chunk (state=3): >>><<< 30564 1726882804.41916: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd73dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd73b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd73ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cccf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cccf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981ccf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cccf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f981cd30880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981ccc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981ccf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cd18970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc6eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc71f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c9d4df0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d48e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d4ee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d4fa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9d4eb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc49d60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc42640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc556a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc75df0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c9e7ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc49280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981cc552b0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc7b9a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7dc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9ba3a0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9ba490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9eefd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e9a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e9580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8d41f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9a5b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e9ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981cc75fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8e6b20> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8e6e50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8f8760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8f8ca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8853d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8e6f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8962b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8f85e0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c896370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e7a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b16d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b19a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8b1790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b1880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8b1cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c8be220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8b1910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8a5a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c9e75e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c8b1ac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f981c7db6a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d67f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c1d6160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6f40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6d60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c1d6fa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1d6100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c12df10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c14cf10> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c14cd30> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c14c3a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f981c768dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c7683a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c768fa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c738c70> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1a9d00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1a93d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1df4c0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c1a94f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1a9520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c10e310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c749220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c11a880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c7493a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c761dc0> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c11a820> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c11a670> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c119610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c119520> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c7408e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f981c19f6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c19daf0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1ad0a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c19f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c1e2ac0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # 
loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981bcf05b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0eb550> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981bc900d0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c19dbe0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0f79a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from 
Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f981c754250> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0f7f10> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0dd7f0> 
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0d8820> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f981c0d2a00> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_lndr240c/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # 
cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # 
cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping 
posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] 
removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] 
removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] 
removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec 
# cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] 
wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 30564 1726882804.42469: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', 
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882804.42473: _low_level_execute_command(): starting 30564 1726882804.42476: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882803.8941927-30731-21224656307516/ > /dev/null 2>&1 && sleep 0' 30564 1726882804.44155: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882804.44158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882804.44313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882804.44316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882804.44318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882804.44378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882804.44511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882804.44514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882804.44727: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882804.47311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882804.47316: stdout chunk (state=3): >>><<< 30564 1726882804.47318: stderr chunk (state=3): >>><<< 30564 1726882804.47734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882804.47738: handler run complete 30564 1726882804.47740: attempt loop complete, returning result 30564 1726882804.47743: _execute() done 30564 1726882804.47745: dumping result to json 30564 1726882804.47747: done dumping result, returning 30564 1726882804.47749: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0e448fcc-3ce9-4216-acec-00000000002e] 30564 1726882804.47751: sending task result for task 0e448fcc-3ce9-4216-acec-00000000002e 
30564 1726882804.47822: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000002e 30564 1726882804.47827: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882804.47892: no more pending results, returning what we have 30564 1726882804.47895: results queue empty 30564 1726882804.47896: checking for any_errors_fatal 30564 1726882804.47902: done checking for any_errors_fatal 30564 1726882804.47903: checking for max_fail_percentage 30564 1726882804.47904: done checking for max_fail_percentage 30564 1726882804.47905: checking to see if all hosts have failed and the running result is not ok 30564 1726882804.47906: done checking to see if all hosts have failed 30564 1726882804.47907: getting the remaining hosts for this loop 30564 1726882804.47908: done getting the remaining hosts for this loop 30564 1726882804.47911: getting the next task for host managed_node2 30564 1726882804.47917: done getting next task for host managed_node2 30564 1726882804.47919: ^ task is: TASK: Set flag to indicate system is ostree 30564 1726882804.47922: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.47925: getting variables 30564 1726882804.47927: in VariableManager get_vars() 30564 1726882804.47953: Calling all_inventory to load vars for managed_node2 30564 1726882804.47956: Calling groups_inventory to load vars for managed_node2 30564 1726882804.47959: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.47979: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.47982: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.47985: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.48174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.48415: done with get_vars() 30564 1726882804.48426: done getting variables 30564 1726882804.48582: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:40:04 -0400 (0:00:00.653) 0:00:03.067 ****** 30564 1726882804.48609: entering _queue_task() for managed_node2/set_fact 30564 1726882804.48611: Creating lock for set_fact 30564 1726882804.49254: worker is 1 (out of 1 available) 30564 1726882804.49324: exiting _queue_task() for managed_node2/set_fact 30564 1726882804.49337: done queuing things up, now waiting for results queue to drain 30564 1726882804.49338: waiting for pending results... 
30564 1726882804.50296: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 30564 1726882804.50396: in run() - task 0e448fcc-3ce9-4216-acec-00000000002f 30564 1726882804.50414: variable 'ansible_search_path' from source: unknown 30564 1726882804.50422: variable 'ansible_search_path' from source: unknown 30564 1726882804.50462: calling self._execute() 30564 1726882804.50540: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.50551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.50567: variable 'omit' from source: magic vars 30564 1726882804.51015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882804.51976: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882804.52265: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882804.52301: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882804.52338: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882804.52427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882804.52456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882804.52490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882804.52518: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882804.52634: Evaluated conditional (not __network_is_ostree is defined): True 30564 1726882804.52648: variable 'omit' from source: magic vars 30564 1726882804.52693: variable 'omit' from source: magic vars 30564 1726882804.52817: variable '__ostree_booted_stat' from source: set_fact 30564 1726882804.52876: variable 'omit' from source: magic vars 30564 1726882804.52907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882804.52940: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882804.52964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882804.52991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882804.53005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882804.53036: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882804.53043: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.53050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.53156: Set connection var ansible_timeout to 10 30564 1726882804.53169: Set connection var ansible_pipelining to False 30564 1726882804.53176: Set connection var ansible_shell_type to sh 30564 1726882804.53189: Set connection var ansible_shell_executable to /bin/sh 30564 1726882804.53201: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882804.53208: Set connection var ansible_connection to ssh 30564 1726882804.53237: variable 'ansible_shell_executable' 
from source: unknown 30564 1726882804.53245: variable 'ansible_connection' from source: unknown 30564 1726882804.53251: variable 'ansible_module_compression' from source: unknown 30564 1726882804.53257: variable 'ansible_shell_type' from source: unknown 30564 1726882804.53265: variable 'ansible_shell_executable' from source: unknown 30564 1726882804.53273: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.53280: variable 'ansible_pipelining' from source: unknown 30564 1726882804.53314: variable 'ansible_timeout' from source: unknown 30564 1726882804.53325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.53429: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882804.53445: variable 'omit' from source: magic vars 30564 1726882804.53454: starting attempt loop 30564 1726882804.53461: running the handler 30564 1726882804.53481: handler run complete 30564 1726882804.53494: attempt loop complete, returning result 30564 1726882804.53501: _execute() done 30564 1726882804.53507: dumping result to json 30564 1726882804.53514: done dumping result, returning 30564 1726882804.53525: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-00000000002f] 30564 1726882804.53539: sending task result for task 0e448fcc-3ce9-4216-acec-00000000002f 30564 1726882804.53642: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000002f 30564 1726882804.53650: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 30564 1726882804.53705: no more pending results, returning what we have 30564 1726882804.53708: results 
queue empty 30564 1726882804.53709: checking for any_errors_fatal 30564 1726882804.53715: done checking for any_errors_fatal 30564 1726882804.53715: checking for max_fail_percentage 30564 1726882804.53717: done checking for max_fail_percentage 30564 1726882804.53718: checking to see if all hosts have failed and the running result is not ok 30564 1726882804.53719: done checking to see if all hosts have failed 30564 1726882804.53719: getting the remaining hosts for this loop 30564 1726882804.53721: done getting the remaining hosts for this loop 30564 1726882804.53724: getting the next task for host managed_node2 30564 1726882804.53733: done getting next task for host managed_node2 30564 1726882804.53735: ^ task is: TASK: Fix CentOS6 Base repo 30564 1726882804.53738: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.53742: getting variables 30564 1726882804.53743: in VariableManager get_vars() 30564 1726882804.53775: Calling all_inventory to load vars for managed_node2 30564 1726882804.53778: Calling groups_inventory to load vars for managed_node2 30564 1726882804.53781: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.53792: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.53794: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.53803: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.54026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.54253: done with get_vars() 30564 1726882804.54265: done getting variables 30564 1726882804.54460: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:40:04 -0400 (0:00:00.060) 0:00:03.127 ****** 30564 1726882804.54649: entering _queue_task() for managed_node2/copy 30564 1726882804.55135: worker is 1 (out of 1 available) 30564 1726882804.55146: exiting _queue_task() for managed_node2/copy 30564 1726882804.55156: done queuing things up, now waiting for results queue to drain 30564 1726882804.55157: waiting for pending results... 
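The banner above queues the `Fix CentOS6 Base repo` task from `el_repo_setup.yml:26` through the `copy` action. Based on the conditionals the executor reports a few entries further down (`ansible_distribution == 'CentOS'` evaluates true, `ansible_distribution_major_version == '6'` evaluates false), the task is likely shaped like this sketch; the copy payload itself never appears in the log, so `dest` and `content` are hypothetical placeholders:

```yaml
# Sketch only: task name, module, and when-conditions are taken from the
# log; the dest path and content variable are hypothetical placeholders.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical path
    content: "{{ centos6_base_repo }}"        # hypothetical variable
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On this managed node the second condition is false, so the executor skips the task and reports `skip_reason: Conditional result was False`.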
30564 1726882804.56061: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 30564 1726882804.56252: in run() - task 0e448fcc-3ce9-4216-acec-000000000031 30564 1726882804.56267: variable 'ansible_search_path' from source: unknown 30564 1726882804.56271: variable 'ansible_search_path' from source: unknown 30564 1726882804.56305: calling self._execute() 30564 1726882804.56489: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.56495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.56504: variable 'omit' from source: magic vars 30564 1726882804.57610: variable 'ansible_distribution' from source: facts 30564 1726882804.57638: Evaluated conditional (ansible_distribution == 'CentOS'): True 30564 1726882804.58030: variable 'ansible_distribution_major_version' from source: facts 30564 1726882804.58037: Evaluated conditional (ansible_distribution_major_version == '6'): False 30564 1726882804.58039: when evaluation is False, skipping this task 30564 1726882804.58042: _execute() done 30564 1726882804.58045: dumping result to json 30564 1726882804.58047: done dumping result, returning 30564 1726882804.58054: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-4216-acec-000000000031] 30564 1726882804.58060: sending task result for task 0e448fcc-3ce9-4216-acec-000000000031 30564 1726882804.58254: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000031 30564 1726882804.58257: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30564 1726882804.58331: no more pending results, returning what we have 30564 1726882804.58335: results queue empty 30564 1726882804.58336: checking for any_errors_fatal 30564 1726882804.58343: done checking for any_errors_fatal 30564 1726882804.58344: checking for 
max_fail_percentage 30564 1726882804.58346: done checking for max_fail_percentage 30564 1726882804.58347: checking to see if all hosts have failed and the running result is not ok 30564 1726882804.58347: done checking to see if all hosts have failed 30564 1726882804.58348: getting the remaining hosts for this loop 30564 1726882804.58350: done getting the remaining hosts for this loop 30564 1726882804.58354: getting the next task for host managed_node2 30564 1726882804.58362: done getting next task for host managed_node2 30564 1726882804.58370: ^ task is: TASK: Include the task 'enable_epel.yml' 30564 1726882804.58374: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.58378: getting variables 30564 1726882804.58380: in VariableManager get_vars() 30564 1726882804.58410: Calling all_inventory to load vars for managed_node2 30564 1726882804.58413: Calling groups_inventory to load vars for managed_node2 30564 1726882804.58417: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.58430: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.58434: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.58437: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.58655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.58904: done with get_vars() 30564 1726882804.58921: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:40:04 -0400 (0:00:00.047) 0:00:03.175 ****** 30564 1726882804.59438: entering _queue_task() for managed_node2/include_tasks 30564 1726882804.60024: worker is 1 (out of 1 available) 30564 1726882804.60036: exiting _queue_task() for managed_node2/include_tasks 30564 1726882804.60160: done queuing things up, now waiting for results queue to drain 30564 1726882804.60161: waiting for pending results... 
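The `Include the task 'enable_epel.yml'` task queued above is gated on the `__network_is_ostree` fact set earlier in the run; the executor later reports `Evaluated conditional (not __network_is_ostree | d(false)): True`, which is why the include proceeds. A minimal sketch of such an include, assuming the conventional `include_tasks` form (only the task name, include target, and conditional are visible in the log):

```yaml
# Sketch: the include target and when-expression are taken from the log;
# the rest is standard include_tasks boilerplate.
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```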
30564 1726882804.61055: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 30564 1726882804.61253: in run() - task 0e448fcc-3ce9-4216-acec-000000000032 30564 1726882804.61267: variable 'ansible_search_path' from source: unknown 30564 1726882804.61271: variable 'ansible_search_path' from source: unknown 30564 1726882804.61307: calling self._execute() 30564 1726882804.61496: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.61500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.61510: variable 'omit' from source: magic vars 30564 1726882804.62531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882804.68009: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882804.68519: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882804.68670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882804.68706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882804.68730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882804.68927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882804.70014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882804.70045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882804.70093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882804.70111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882804.70227: variable '__network_is_ostree' from source: set_fact 30564 1726882804.70250: Evaluated conditional (not __network_is_ostree | d(false)): True 30564 1726882804.70262: _execute() done 30564 1726882804.70271: dumping result to json 30564 1726882804.70280: done dumping result, returning 30564 1726882804.70290: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-4216-acec-000000000032] 30564 1726882804.70298: sending task result for task 0e448fcc-3ce9-4216-acec-000000000032 30564 1726882804.70422: no more pending results, returning what we have 30564 1726882804.70427: in VariableManager get_vars() 30564 1726882804.70458: Calling all_inventory to load vars for managed_node2 30564 1726882804.70460: Calling groups_inventory to load vars for managed_node2 30564 1726882804.70465: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.70480: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.70483: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.70486: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.70701: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000032 30564 1726882804.70705: WORKER PROCESS EXITING 30564 1726882804.70719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 30564 1726882804.70942: done with get_vars() 30564 1726882804.70950: variable 'ansible_search_path' from source: unknown 30564 1726882804.70951: variable 'ansible_search_path' from source: unknown 30564 1726882804.70992: we have included files to process 30564 1726882804.70993: generating all_blocks data 30564 1726882804.70995: done generating all_blocks data 30564 1726882804.71000: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30564 1726882804.71001: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30564 1726882804.71003: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 30564 1726882804.72495: done processing included file 30564 1726882804.72497: iterating over new_blocks loaded from include file 30564 1726882804.72499: in VariableManager get_vars() 30564 1726882804.72510: done with get_vars() 30564 1726882804.72512: filtering new block on tags 30564 1726882804.72534: done filtering new block on tags 30564 1726882804.72537: in VariableManager get_vars() 30564 1726882804.72661: done with get_vars() 30564 1726882804.72664: filtering new block on tags 30564 1726882804.72681: done filtering new block on tags 30564 1726882804.72683: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 30564 1726882804.72689: extending task lists for all hosts with included blocks 30564 1726882804.72815: done extending task lists 30564 1726882804.72817: done processing included files 30564 1726882804.72817: results queue empty 30564 1726882804.72818: checking for any_errors_fatal 30564 1726882804.72821: done checking for any_errors_fatal 30564 1726882804.72822: checking for max_fail_percentage 30564 1726882804.72823: done 
checking for max_fail_percentage 30564 1726882804.72823: checking to see if all hosts have failed and the running result is not ok 30564 1726882804.72824: done checking to see if all hosts have failed 30564 1726882804.72825: getting the remaining hosts for this loop 30564 1726882804.72826: done getting the remaining hosts for this loop 30564 1726882804.72828: getting the next task for host managed_node2 30564 1726882804.72832: done getting next task for host managed_node2 30564 1726882804.72834: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 30564 1726882804.72837: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.72839: getting variables 30564 1726882804.72840: in VariableManager get_vars() 30564 1726882804.72848: Calling all_inventory to load vars for managed_node2 30564 1726882804.72850: Calling groups_inventory to load vars for managed_node2 30564 1726882804.72852: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.72858: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.72873: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.72877: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.73054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.73248: done with get_vars() 30564 1726882804.73255: done getting variables 30564 1726882804.73327: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 30564 1726882804.73552: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:40:04 -0400 (0:00:00.141) 0:00:03.317 ****** 30564 1726882804.73603: entering _queue_task() for managed_node2/command 30564 1726882804.73605: Creating lock for command 30564 1726882804.74188: worker is 1 (out of 1 available) 30564 1726882804.74205: exiting _queue_task() for managed_node2/command 30564 1726882804.74217: done queuing things up, now waiting for results queue to drain 30564 1726882804.74218: waiting for pending results... 
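The templated task name `Create EPEL {{ ansible_distribution_major_version }}` renders as `Create EPEL 9` on this host. The executor reports the `command` action plus two conditionals, suggesting a shape like the sketch below; the actual command line never appears in the log, so it is left as a hypothetical placeholder:

```yaml
# Sketch: name template, action type, and when-conditions come from the
# log; the command itself is not shown there and is a placeholder.
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command: "{{ __create_epel_cmd }}"  # hypothetical variable
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

On an EL 9 node the second conditional is false, so the task is skipped.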
30564 1726882804.74767: running TaskExecutor() for managed_node2/TASK: Create EPEL 9 30564 1726882804.74983: in run() - task 0e448fcc-3ce9-4216-acec-00000000004c 30564 1726882804.75003: variable 'ansible_search_path' from source: unknown 30564 1726882804.75010: variable 'ansible_search_path' from source: unknown 30564 1726882804.75162: calling self._execute() 30564 1726882804.75250: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.75267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.75284: variable 'omit' from source: magic vars 30564 1726882804.76135: variable 'ansible_distribution' from source: facts 30564 1726882804.76166: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30564 1726882804.76394: variable 'ansible_distribution_major_version' from source: facts 30564 1726882804.76469: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30564 1726882804.76478: when evaluation is False, skipping this task 30564 1726882804.76486: _execute() done 30564 1726882804.76494: dumping result to json 30564 1726882804.76576: done dumping result, returning 30564 1726882804.76588: done running TaskExecutor() for managed_node2/TASK: Create EPEL 9 [0e448fcc-3ce9-4216-acec-00000000004c] 30564 1726882804.76599: sending task result for task 0e448fcc-3ce9-4216-acec-00000000004c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30564 1726882804.76771: no more pending results, returning what we have 30564 1726882804.76774: results queue empty 30564 1726882804.76775: checking for any_errors_fatal 30564 1726882804.76776: done checking for any_errors_fatal 30564 1726882804.76777: checking for max_fail_percentage 30564 1726882804.76778: done checking for max_fail_percentage 30564 1726882804.76779: checking to see if all hosts have failed and 
the running result is not ok 30564 1726882804.76780: done checking to see if all hosts have failed 30564 1726882804.76781: getting the remaining hosts for this loop 30564 1726882804.76782: done getting the remaining hosts for this loop 30564 1726882804.76786: getting the next task for host managed_node2 30564 1726882804.76793: done getting next task for host managed_node2 30564 1726882804.76795: ^ task is: TASK: Install yum-utils package 30564 1726882804.76799: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.76803: getting variables 30564 1726882804.76804: in VariableManager get_vars() 30564 1726882804.76832: Calling all_inventory to load vars for managed_node2 30564 1726882804.76835: Calling groups_inventory to load vars for managed_node2 30564 1726882804.76839: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.76854: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.76858: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.76862: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.77085: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000004c 30564 1726882804.77089: WORKER PROCESS EXITING 30564 1726882804.77109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.77331: done with get_vars() 30564 1726882804.77341: done getting variables 30564 1726882804.77446: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:40:04 -0400 (0:00:00.040) 0:00:03.357 ****** 30564 1726882804.77611: entering _queue_task() for managed_node2/package 30564 1726882804.77613: Creating lock for package 30564 1726882804.78149: worker is 1 (out of 1 available) 30564 1726882804.78160: exiting _queue_task() for managed_node2/package 30564 1726882804.78175: done queuing things up, now waiting for results queue to drain 30564 1726882804.78176: waiting for pending results... 
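`Install yum-utils package` loads the generic `package` action. Given the task name and the conditionals reported in the skip result, a plausible sketch (the `state` value is an assumption; the log shows only the conditionals and the action type):

```yaml
# Sketch: package name inferred from the task name; state: present is an
# assumption; the when-conditions are copied from the log.
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```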
30564 1726882804.79176: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 30564 1726882804.79268: in run() - task 0e448fcc-3ce9-4216-acec-00000000004d 30564 1726882804.79282: variable 'ansible_search_path' from source: unknown 30564 1726882804.79286: variable 'ansible_search_path' from source: unknown 30564 1726882804.79317: calling self._execute() 30564 1726882804.79392: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.79403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.79415: variable 'omit' from source: magic vars 30564 1726882804.79810: variable 'ansible_distribution' from source: facts 30564 1726882804.79828: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30564 1726882804.79985: variable 'ansible_distribution_major_version' from source: facts 30564 1726882804.80005: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30564 1726882804.80013: when evaluation is False, skipping this task 30564 1726882804.80020: _execute() done 30564 1726882804.80027: dumping result to json 30564 1726882804.80034: done dumping result, returning 30564 1726882804.80044: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0e448fcc-3ce9-4216-acec-00000000004d] 30564 1726882804.80053: sending task result for task 0e448fcc-3ce9-4216-acec-00000000004d skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30564 1726882804.80213: no more pending results, returning what we have 30564 1726882804.80218: results queue empty 30564 1726882804.80218: checking for any_errors_fatal 30564 1726882804.80225: done checking for any_errors_fatal 30564 1726882804.80226: checking for max_fail_percentage 30564 1726882804.80228: done checking for max_fail_percentage 30564 1726882804.80229: checking to see if 
all hosts have failed and the running result is not ok 30564 1726882804.80230: done checking to see if all hosts have failed 30564 1726882804.80231: getting the remaining hosts for this loop 30564 1726882804.80232: done getting the remaining hosts for this loop 30564 1726882804.80236: getting the next task for host managed_node2 30564 1726882804.80243: done getting next task for host managed_node2 30564 1726882804.80246: ^ task is: TASK: Enable EPEL 7 30564 1726882804.80250: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.80254: getting variables 30564 1726882804.80256: in VariableManager get_vars() 30564 1726882804.80289: Calling all_inventory to load vars for managed_node2 30564 1726882804.80291: Calling groups_inventory to load vars for managed_node2 30564 1726882804.80295: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.80307: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.80311: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.80314: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.80989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.81435: done with get_vars() 30564 1726882804.81446: done getting variables 30564 1726882804.81738: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000004d 30564 1726882804.81742: WORKER PROCESS EXITING 30564 1726882804.81758: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:40:04 -0400 (0:00:00.041) 0:00:03.399 ****** 30564 1726882804.81795: entering _queue_task() for managed_node2/command 30564 1726882804.82381: worker is 1 (out of 1 available) 30564 1726882804.82391: exiting _queue_task() for managed_node2/command 30564 1726882804.82403: done queuing things up, now waiting for results queue to drain 30564 1726882804.82404: waiting for pending results... 
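`Enable EPEL 7` (and the `Enable EPEL 8` task that follows it) run through the `command` action under the same pair of conditionals, so both are skipped on this EL 9 host. A hedged sketch; `yum-config-manager --enable` is a common way to flip a repo on, but the log does not show the actual argv:

```yaml
# Sketch: the command shown is an assumed convention for enabling a repo,
# not taken from the log; the when-conditions are verbatim from the log.
- name: Enable EPEL 7
  ansible.builtin.command: yum-config-manager --enable epel  # assumed command
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```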
30564 1726882804.83088: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 30564 1726882804.83171: in run() - task 0e448fcc-3ce9-4216-acec-00000000004e 30564 1726882804.83182: variable 'ansible_search_path' from source: unknown 30564 1726882804.83186: variable 'ansible_search_path' from source: unknown 30564 1726882804.83216: calling self._execute() 30564 1726882804.83284: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.83287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.83299: variable 'omit' from source: magic vars 30564 1726882804.84275: variable 'ansible_distribution' from source: facts 30564 1726882804.84295: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30564 1726882804.84696: variable 'ansible_distribution_major_version' from source: facts 30564 1726882804.84706: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30564 1726882804.84713: when evaluation is False, skipping this task 30564 1726882804.84719: _execute() done 30564 1726882804.84726: dumping result to json 30564 1726882804.84732: done dumping result, returning 30564 1726882804.84741: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0e448fcc-3ce9-4216-acec-00000000004e] 30564 1726882804.84749: sending task result for task 0e448fcc-3ce9-4216-acec-00000000004e skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30564 1726882804.84895: no more pending results, returning what we have 30564 1726882804.84898: results queue empty 30564 1726882804.84899: checking for any_errors_fatal 30564 1726882804.84906: done checking for any_errors_fatal 30564 1726882804.84906: checking for max_fail_percentage 30564 1726882804.84908: done checking for max_fail_percentage 30564 1726882804.84909: checking to see if all hosts have failed and 
the running result is not ok 30564 1726882804.84910: done checking to see if all hosts have failed 30564 1726882804.84910: getting the remaining hosts for this loop 30564 1726882804.84912: done getting the remaining hosts for this loop 30564 1726882804.84915: getting the next task for host managed_node2 30564 1726882804.84922: done getting next task for host managed_node2 30564 1726882804.84924: ^ task is: TASK: Enable EPEL 8 30564 1726882804.84928: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.84932: getting variables 30564 1726882804.84933: in VariableManager get_vars() 30564 1726882804.84960: Calling all_inventory to load vars for managed_node2 30564 1726882804.84962: Calling groups_inventory to load vars for managed_node2 30564 1726882804.84970: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.84984: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.84987: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.84991: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.85336: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000004e 30564 1726882804.85339: WORKER PROCESS EXITING 30564 1726882804.85359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.85585: done with get_vars() 30564 1726882804.85594: done getting variables 30564 1726882804.85760: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:40:04 -0400 (0:00:00.039) 0:00:03.439 ****** 30564 1726882804.85791: entering _queue_task() for managed_node2/command 30564 1726882804.86242: worker is 1 (out of 1 available) 30564 1726882804.86253: exiting _queue_task() for managed_node2/command 30564 1726882804.86386: done queuing things up, now waiting for results queue to drain 30564 1726882804.86388: waiting for pending results... 
30564 1726882804.87324: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 30564 1726882804.87577: in run() - task 0e448fcc-3ce9-4216-acec-00000000004f 30564 1726882804.87647: variable 'ansible_search_path' from source: unknown 30564 1726882804.87687: variable 'ansible_search_path' from source: unknown 30564 1726882804.87775: calling self._execute() 30564 1726882804.88022: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.88036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.88051: variable 'omit' from source: magic vars 30564 1726882804.88922: variable 'ansible_distribution' from source: facts 30564 1726882804.89017: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30564 1726882804.89433: variable 'ansible_distribution_major_version' from source: facts 30564 1726882804.89444: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 30564 1726882804.89451: when evaluation is False, skipping this task 30564 1726882804.89527: _execute() done 30564 1726882804.89536: dumping result to json 30564 1726882804.89546: done dumping result, returning 30564 1726882804.89555: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0e448fcc-3ce9-4216-acec-00000000004f] 30564 1726882804.89566: sending task result for task 0e448fcc-3ce9-4216-acec-00000000004f skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 30564 1726882804.89711: no more pending results, returning what we have 30564 1726882804.89715: results queue empty 30564 1726882804.89716: checking for any_errors_fatal 30564 1726882804.89723: done checking for any_errors_fatal 30564 1726882804.89724: checking for max_fail_percentage 30564 1726882804.89725: done checking for max_fail_percentage 30564 1726882804.89726: checking to see if all hosts have failed and 
the running result is not ok 30564 1726882804.89727: done checking to see if all hosts have failed 30564 1726882804.89728: getting the remaining hosts for this loop 30564 1726882804.89729: done getting the remaining hosts for this loop 30564 1726882804.89733: getting the next task for host managed_node2 30564 1726882804.89745: done getting next task for host managed_node2 30564 1726882804.89747: ^ task is: TASK: Enable EPEL 6 30564 1726882804.89752: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.89756: getting variables 30564 1726882804.89758: in VariableManager get_vars() 30564 1726882804.89792: Calling all_inventory to load vars for managed_node2 30564 1726882804.89795: Calling groups_inventory to load vars for managed_node2 30564 1726882804.89798: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.89812: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.89816: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.89819: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.90008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.90240: done with get_vars() 30564 1726882804.90249: done getting variables 30564 1726882804.90506: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000004f 30564 1726882804.90509: WORKER PROCESS EXITING 30564 1726882804.90549: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:40:04 -0400 (0:00:00.047) 0:00:03.487 ****** 30564 1726882804.90595: entering _queue_task() for managed_node2/copy 30564 1726882804.91180: worker is 1 (out of 1 available) 30564 1726882804.91191: exiting _queue_task() for managed_node2/copy 30564 1726882804.91205: done queuing things up, now waiting for results queue to drain 30564 1726882804.91206: waiting for pending results... 
30564 1726882804.92081: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 30564 1726882804.92859: in run() - task 0e448fcc-3ce9-4216-acec-000000000051 30564 1726882804.93583: variable 'ansible_search_path' from source: unknown 30564 1726882804.93592: variable 'ansible_search_path' from source: unknown 30564 1726882804.93632: calling self._execute() 30564 1726882804.93709: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.93721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.93735: variable 'omit' from source: magic vars 30564 1726882804.94151: variable 'ansible_distribution' from source: facts 30564 1726882804.94171: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 30564 1726882804.94285: variable 'ansible_distribution_major_version' from source: facts 30564 1726882804.94297: Evaluated conditional (ansible_distribution_major_version == '6'): False 30564 1726882804.94304: when evaluation is False, skipping this task 30564 1726882804.94312: _execute() done 30564 1726882804.94318: dumping result to json 30564 1726882804.94324: done dumping result, returning 30564 1726882804.94332: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0e448fcc-3ce9-4216-acec-000000000051] 30564 1726882804.94341: sending task result for task 0e448fcc-3ce9-4216-acec-000000000051 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 30564 1726882804.94483: no more pending results, returning what we have 30564 1726882804.94486: results queue empty 30564 1726882804.94487: checking for any_errors_fatal 30564 1726882804.94493: done checking for any_errors_fatal 30564 1726882804.94494: checking for max_fail_percentage 30564 1726882804.94495: done checking for max_fail_percentage 30564 1726882804.94496: checking to see if all hosts have failed and the running 
result is not ok 30564 1726882804.94497: done checking to see if all hosts have failed 30564 1726882804.94498: getting the remaining hosts for this loop 30564 1726882804.94499: done getting the remaining hosts for this loop 30564 1726882804.94502: getting the next task for host managed_node2 30564 1726882804.94513: done getting next task for host managed_node2 30564 1726882804.94515: ^ task is: TASK: Set network provider to 'nm' 30564 1726882804.94518: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882804.94523: getting variables 30564 1726882804.94525: in VariableManager get_vars() 30564 1726882804.94552: Calling all_inventory to load vars for managed_node2 30564 1726882804.94553: Calling groups_inventory to load vars for managed_node2 30564 1726882804.94557: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.94574: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.94577: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.94581: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.94804: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000051 30564 1726882804.94808: WORKER PROCESS EXITING 30564 1726882804.94822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.95088: done with get_vars() 30564 1726882804.95104: done getting variables 30564 1726882804.95189: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:13 Friday 20 September 2024 21:40:04 -0400 (0:00:00.047) 0:00:03.534 ****** 30564 1726882804.95336: entering _queue_task() for managed_node2/set_fact 30564 1726882804.95912: worker is 1 (out of 1 available) 30564 1726882804.95923: exiting _queue_task() for managed_node2/set_fact 30564 1726882804.95937: done queuing things up, now waiting for results queue to drain 30564 1726882804.95938: waiting for pending results... 30564 1726882804.96440: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 30564 1726882804.96524: in run() - task 0e448fcc-3ce9-4216-acec-000000000007 30564 1726882804.96543: variable 'ansible_search_path' from source: unknown 30564 1726882804.96582: calling self._execute() 30564 1726882804.96653: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.96666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.96682: variable 'omit' from source: magic vars 30564 1726882804.96821: variable 'omit' from source: magic vars 30564 1726882804.97411: variable 'omit' from source: magic vars 30564 1726882804.97451: variable 'omit' from source: magic vars 30564 1726882804.97500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882804.97539: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882804.97566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882804.97589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882804.97606: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882804.97640: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882804.97648: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.97655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.97755: Set connection var ansible_timeout to 10 30564 1726882804.97765: Set connection var ansible_pipelining to False 30564 1726882804.97772: Set connection var ansible_shell_type to sh 30564 1726882804.97779: Set connection var ansible_shell_executable to /bin/sh 30564 1726882804.97788: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882804.97793: Set connection var ansible_connection to ssh 30564 1726882804.97814: variable 'ansible_shell_executable' from source: unknown 30564 1726882804.97820: variable 'ansible_connection' from source: unknown 30564 1726882804.97825: variable 'ansible_module_compression' from source: unknown 30564 1726882804.97830: variable 'ansible_shell_type' from source: unknown 30564 1726882804.97835: variable 'ansible_shell_executable' from source: unknown 30564 1726882804.97841: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882804.97847: variable 'ansible_pipelining' from source: unknown 30564 1726882804.97852: variable 'ansible_timeout' from source: unknown 30564 1726882804.97858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882804.97995: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882804.98269: variable 'omit' from source: magic vars 30564 1726882804.98282: starting 
attempt loop 30564 1726882804.98290: running the handler 30564 1726882804.98306: handler run complete 30564 1726882804.98319: attempt loop complete, returning result 30564 1726882804.98326: _execute() done 30564 1726882804.98345: dumping result to json 30564 1726882804.98382: done dumping result, returning 30564 1726882804.98418: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0e448fcc-3ce9-4216-acec-000000000007] 30564 1726882804.98428: sending task result for task 0e448fcc-3ce9-4216-acec-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 30564 1726882804.98616: no more pending results, returning what we have 30564 1726882804.98619: results queue empty 30564 1726882804.98620: checking for any_errors_fatal 30564 1726882804.98624: done checking for any_errors_fatal 30564 1726882804.98625: checking for max_fail_percentage 30564 1726882804.98626: done checking for max_fail_percentage 30564 1726882804.98627: checking to see if all hosts have failed and the running result is not ok 30564 1726882804.98628: done checking to see if all hosts have failed 30564 1726882804.98629: getting the remaining hosts for this loop 30564 1726882804.98630: done getting the remaining hosts for this loop 30564 1726882804.98633: getting the next task for host managed_node2 30564 1726882804.98640: done getting next task for host managed_node2 30564 1726882804.98642: ^ task is: TASK: meta (flush_handlers) 30564 1726882804.98643: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882804.98648: getting variables 30564 1726882804.98649: in VariableManager get_vars() 30564 1726882804.98679: Calling all_inventory to load vars for managed_node2 30564 1726882804.98681: Calling groups_inventory to load vars for managed_node2 30564 1726882804.98686: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.98699: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.98702: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.98706: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.98883: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000007 30564 1726882804.98887: WORKER PROCESS EXITING 30564 1726882804.98916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.99146: done with get_vars() 30564 1726882804.99156: done getting variables 30564 1726882804.99232: in VariableManager get_vars() 30564 1726882804.99241: Calling all_inventory to load vars for managed_node2 30564 1726882804.99244: Calling groups_inventory to load vars for managed_node2 30564 1726882804.99246: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.99250: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.99253: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.99255: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.99453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882804.99687: done with get_vars() 30564 1726882804.99700: done queuing things up, now waiting for results queue to drain 30564 1726882804.99702: results queue empty 30564 1726882804.99703: checking for any_errors_fatal 30564 1726882804.99705: done checking for any_errors_fatal 30564 1726882804.99705: checking for max_fail_percentage 30564 
1726882804.99706: done checking for max_fail_percentage 30564 1726882804.99707: checking to see if all hosts have failed and the running result is not ok 30564 1726882804.99708: done checking to see if all hosts have failed 30564 1726882804.99709: getting the remaining hosts for this loop 30564 1726882804.99709: done getting the remaining hosts for this loop 30564 1726882804.99712: getting the next task for host managed_node2 30564 1726882804.99715: done getting next task for host managed_node2 30564 1726882804.99717: ^ task is: TASK: meta (flush_handlers) 30564 1726882804.99718: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882804.99725: getting variables 30564 1726882804.99726: in VariableManager get_vars() 30564 1726882804.99733: Calling all_inventory to load vars for managed_node2 30564 1726882804.99735: Calling groups_inventory to load vars for managed_node2 30564 1726882804.99737: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882804.99741: Calling all_plugins_play to load vars for managed_node2 30564 1726882804.99744: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882804.99746: Calling groups_plugins_play to load vars for managed_node2 30564 1726882804.99912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882805.00138: done with get_vars() 30564 1726882805.00146: done getting variables 30564 1726882805.00217: in VariableManager get_vars() 30564 1726882805.00226: Calling all_inventory to load vars for managed_node2 30564 1726882805.00228: Calling groups_inventory to load vars for managed_node2 30564 1726882805.00230: Calling all_plugins_inventory to load vars for managed_node2 30564 
1726882805.00234: Calling all_plugins_play to load vars for managed_node2 30564 1726882805.00237: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882805.00239: Calling groups_plugins_play to load vars for managed_node2 30564 1726882805.00480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882805.00703: done with get_vars() 30564 1726882805.00714: done queuing things up, now waiting for results queue to drain 30564 1726882805.00715: results queue empty 30564 1726882805.00716: checking for any_errors_fatal 30564 1726882805.00717: done checking for any_errors_fatal 30564 1726882805.00718: checking for max_fail_percentage 30564 1726882805.00719: done checking for max_fail_percentage 30564 1726882805.00720: checking to see if all hosts have failed and the running result is not ok 30564 1726882805.00721: done checking to see if all hosts have failed 30564 1726882805.00721: getting the remaining hosts for this loop 30564 1726882805.00722: done getting the remaining hosts for this loop 30564 1726882805.00724: getting the next task for host managed_node2 30564 1726882805.00727: done getting next task for host managed_node2 30564 1726882805.00728: ^ task is: None 30564 1726882805.00729: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882805.00730: done queuing things up, now waiting for results queue to drain 30564 1726882805.00731: results queue empty 30564 1726882805.00732: checking for any_errors_fatal 30564 1726882805.00732: done checking for any_errors_fatal 30564 1726882805.00733: checking for max_fail_percentage 30564 1726882805.00734: done checking for max_fail_percentage 30564 1726882805.00735: checking to see if all hosts have failed and the running result is not ok 30564 1726882805.00735: done checking to see if all hosts have failed 30564 1726882805.00737: getting the next task for host managed_node2 30564 1726882805.00739: done getting next task for host managed_node2 30564 1726882805.00739: ^ task is: None 30564 1726882805.00741: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882805.00801: in VariableManager get_vars() 30564 1726882805.00815: done with get_vars() 30564 1726882805.00821: in VariableManager get_vars() 30564 1726882805.00830: done with get_vars() 30564 1726882805.00835: variable 'omit' from source: magic vars 30564 1726882805.00877: in VariableManager get_vars() 30564 1726882805.00887: done with get_vars() 30564 1726882805.00911: variable 'omit' from source: magic vars PLAY [Play for testing states] ************************************************* 30564 1726882805.01253: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 30564 1726882805.01280: getting the remaining hosts for this loop 30564 1726882805.01282: done getting the remaining hosts for this loop 30564 1726882805.01284: getting the next task for host managed_node2 30564 1726882805.01286: done getting next task for host managed_node2 30564 1726882805.01288: ^ task is: TASK: Gathering Facts 30564 1726882805.01289: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882805.01291: getting variables 30564 1726882805.01293: in VariableManager get_vars() 30564 1726882805.01300: Calling all_inventory to load vars for managed_node2 30564 1726882805.01302: Calling groups_inventory to load vars for managed_node2 30564 1726882805.01304: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882805.01308: Calling all_plugins_play to load vars for managed_node2 30564 1726882805.01320: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882805.01322: Calling groups_plugins_play to load vars for managed_node2 30564 1726882805.01483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882805.01741: done with get_vars() 30564 1726882805.01748: done getting variables 30564 1726882805.01790: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:3 Friday 20 September 2024 21:40:05 -0400 (0:00:00.064) 0:00:03.599 ****** 30564 1726882805.01820: entering _queue_task() for managed_node2/gather_facts 30564 1726882805.02060: worker is 1 (out of 1 available) 30564 1726882805.02077: exiting _queue_task() for managed_node2/gather_facts 30564 1726882805.02088: done queuing things up, now waiting for results queue to drain 30564 1726882805.02097: waiting for pending results... 
30564 1726882805.02440: running TaskExecutor() for managed_node2/TASK: Gathering Facts 30564 1726882805.02547: in run() - task 0e448fcc-3ce9-4216-acec-000000000077 30564 1726882805.02589: variable 'ansible_search_path' from source: unknown 30564 1726882805.02636: calling self._execute() 30564 1726882805.02741: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882805.02753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882805.02771: variable 'omit' from source: magic vars 30564 1726882805.03190: variable 'ansible_distribution_major_version' from source: facts 30564 1726882805.03211: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882805.03232: variable 'omit' from source: magic vars 30564 1726882805.03260: variable 'omit' from source: magic vars 30564 1726882805.03305: variable 'omit' from source: magic vars 30564 1726882805.03362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882805.03406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882805.03427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882805.03462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882805.03483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882805.03514: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882805.03523: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882805.03530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882805.03640: Set connection var ansible_timeout to 10 30564 1726882805.03661: Set connection 
var ansible_pipelining to False 30564 1726882805.03676: Set connection var ansible_shell_type to sh 30564 1726882805.03688: Set connection var ansible_shell_executable to /bin/sh 30564 1726882805.03701: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882805.03708: Set connection var ansible_connection to ssh 30564 1726882805.03736: variable 'ansible_shell_executable' from source: unknown 30564 1726882805.03744: variable 'ansible_connection' from source: unknown 30564 1726882805.03751: variable 'ansible_module_compression' from source: unknown 30564 1726882805.03772: variable 'ansible_shell_type' from source: unknown 30564 1726882805.03784: variable 'ansible_shell_executable' from source: unknown 30564 1726882805.03792: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882805.03800: variable 'ansible_pipelining' from source: unknown 30564 1726882805.03807: variable 'ansible_timeout' from source: unknown 30564 1726882805.03815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882805.04018: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882805.04032: variable 'omit' from source: magic vars 30564 1726882805.04042: starting attempt loop 30564 1726882805.04049: running the handler 30564 1726882805.04099: variable 'ansible_facts' from source: unknown 30564 1726882805.04112: _low_level_execute_command(): starting 30564 1726882805.04128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882805.04673: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882805.04691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.04726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.04742: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882805.04760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.04800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882805.04819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882805.04831: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882805.04853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.04875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.04893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.04917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.04931: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882805.04945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.05030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882805.05053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882805.05078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882805.05228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882805.07506: stdout chunk (state=3): >>>/root <<< 30564 1726882805.07739: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 30564 1726882805.07742: stdout chunk (state=3): >>><<< 30564 1726882805.07745: stderr chunk (state=3): >>><<< 30564 1726882805.07773: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882805.07870: _low_level_execute_command(): starting 30564 1726882805.07874: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189 `" && echo ansible-tmp-1726882805.0777519-30787-184080180850189="` echo /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189 `" ) && sleep 0' 30564 1726882805.08461: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882805.08484: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882805.08500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.08526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.08571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.08585: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882805.08599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.08617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882805.08638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882805.08650: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882805.08663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.08685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.08702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.08713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.08724: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882805.08739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.08823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882805.08844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882805.08876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882805.09015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 4 <<< 30564 1726882805.11728: stdout chunk (state=3): >>>ansible-tmp-1726882805.0777519-30787-184080180850189=/root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189 <<< 30564 1726882805.11905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882805.11971: stderr chunk (state=3): >>><<< 30564 1726882805.11975: stdout chunk (state=3): >>><<< 30564 1726882805.12073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882805.0777519-30787-184080180850189=/root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882805.12077: variable 'ansible_module_compression' from source: unknown 30564 1726882805.12274: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 30564 
1726882805.12278: variable 'ansible_facts' from source: unknown 30564 1726882805.12348: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189/AnsiballZ_setup.py 30564 1726882805.12510: Sending initial data 30564 1726882805.12513: Sent initial data (154 bytes) 30564 1726882805.13451: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882805.13470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.13491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.13512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.13549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.13560: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882805.13580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.13598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882805.13609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882805.13618: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882805.13629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.13648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.13666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.13680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.13693: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882805.13706: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.13793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882805.13819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882805.13835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882805.13972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882805.16454: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882805.16555: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882805.16671: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpalo8235i /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189/AnsiballZ_setup.py <<< 30564 1726882805.16776: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882805.19443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882805.19573: stderr chunk (state=3): >>><<< 30564 1726882805.19577: stdout chunk (state=3): >>><<< 30564 1726882805.19687: done transferring module to remote 30564 1726882805.19694: _low_level_execute_command(): starting 30564 1726882805.19697: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189/ /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189/AnsiballZ_setup.py && sleep 0' 30564 1726882805.20279: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882805.20294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.20309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.20326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.20376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.20389: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882805.20402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.20418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882805.20428: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882805.20438: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882805.20455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.20475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.20491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.20502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882805.20513: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882805.20525: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.20632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882805.20662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882805.20699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882805.20849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882805.23492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882805.23495: stdout chunk (state=3): >>><<< 30564 1726882805.23497: stderr chunk (state=3): >>><<< 30564 1726882805.23584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 30564 1726882805.23587: _low_level_execute_command(): starting 30564 1726882805.23590: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189/AnsiballZ_setup.py && sleep 0' 30564 1726882805.24040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882805.24055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.24068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.24086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.24124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.24128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882805.24130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.24236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882805.24256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882805.24275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882805.24416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882805.94642: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", 
"SSH_CLIENT": "10.31.40.7 3352<<< 30564 1726882805.94665: stdout chunk (state=3): >>>8 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "40", "second": "05", "epoch": "1726882805", "epoch_int": "1726882805", "date": "2024-09-20", "time": "21:40:05", "iso8601_micro": "2024-09-21T01:40:05.640174Z", "iso8601": "2024-09-21T01:40:05Z", "iso8601_basic": "20240920T214005640174", "iso8601_basic_short": "20240920T214005", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2788, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 744, "free": 2788}, "nocache": {"free": 3252, "used": 280}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234164224, "block_size": 4096, "block_total": 65519355, "block_available": 64510294, "block_used": 1009061, "inode_total": 131071472, "inode_available": 130998690, "inode_used": 72782, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": 
"2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "19<<< 30564 1726882805.94896: stdout chunk (state=3): >>>2.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "<<< 30564 1726882805.94913: stdout chunk (state=3): >>>tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off 
[fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": 
"10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.44, "5m": 0.42, "15m": 0.26}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 30564 1726882805.96996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882805.97051: stderr chunk (state=3): >>><<< 30564 1726882805.97054: stdout chunk (state=3): >>><<< 30564 1726882805.97090: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "40", "second": "05", "epoch": "1726882805", "epoch_int": "1726882805", "date": "2024-09-20", "time": "21:40:05", "iso8601_micro": "2024-09-21T01:40:05.640174Z", "iso8601": "2024-09-21T01:40:05Z", "iso8601_basic": "20240920T214005640174", "iso8601_basic_short": "20240920T214005", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) 
Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2788, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 744, "free": 2788}, "nocache": {"free": 3252, "used": 280}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 744, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264234164224, "block_size": 4096, "block_total": 65519355, "block_available": 64510294, "block_used": 1009061, "inode_total": 131071472, "inode_available": 130998690, "inode_used": 72782, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "2e:06:5a:d7:92:57", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", 
"tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": 
"64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.44, "5m": 0.42, "15m": 0.26}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882805.97332: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882805.97349: _low_level_execute_command(): starting 30564 1726882805.97352: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882805.0777519-30787-184080180850189/ > /dev/null 2>&1 && sleep 0' 30564 1726882805.97805: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.97809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882805.97846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.97849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882805.97851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882805.97853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882805.97907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882805.97911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882805.98021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 30564 1726882806.00590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.00636: stderr chunk (state=3): >>><<< 30564 1726882806.00639: stdout chunk (state=3): >>><<< 30564 1726882806.00651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 4 debug2: Received exit status from master 0 30564 1726882806.00660: handler run complete 30564 1726882806.00745: variable 'ansible_facts' from source: unknown 30564 1726882806.00813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.01011: variable 'ansible_facts' from source: unknown 30564 1726882806.01065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.01153: attempt loop complete, returning result 30564 1726882806.01156: _execute() done 30564 1726882806.01159: dumping result to json 30564 1726882806.01184: done dumping result, returning 30564 1726882806.01191: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-4216-acec-000000000077] 30564 1726882806.01196: sending task result for task 0e448fcc-3ce9-4216-acec-000000000077 30564 1726882806.01471: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000077 30564 1726882806.01475: WORKER PROCESS EXITING ok: [managed_node2] 30564 1726882806.01689: no more pending results, returning what we have 30564 1726882806.01692: results queue empty 30564 1726882806.01692: checking for any_errors_fatal 30564 1726882806.01693: done checking for any_errors_fatal 30564 1726882806.01694: checking for max_fail_percentage 30564 1726882806.01695: done checking for max_fail_percentage 30564 1726882806.01695: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.01696: done checking to see if all hosts have failed 30564 1726882806.01696: getting the remaining hosts for this loop 30564 1726882806.01697: done getting the remaining hosts for this loop 30564 1726882806.01700: getting the next task for host managed_node2 30564 1726882806.01704: done getting next task for host managed_node2 30564 1726882806.01705: ^ task is: TASK: meta (flush_handlers) 30564 1726882806.01707: ^ state is: HOST STATE: 
block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882806.01710: getting variables 30564 1726882806.01711: in VariableManager get_vars() 30564 1726882806.01727: Calling all_inventory to load vars for managed_node2 30564 1726882806.01728: Calling groups_inventory to load vars for managed_node2 30564 1726882806.01731: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.01738: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.01740: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.01742: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.01850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.01990: done with get_vars() 30564 1726882806.01997: done getting variables 30564 1726882806.02046: in VariableManager get_vars() 30564 1726882806.02053: Calling all_inventory to load vars for managed_node2 30564 1726882806.02054: Calling groups_inventory to load vars for managed_node2 30564 1726882806.02055: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.02059: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.02060: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.02062: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.02151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.02267: done with get_vars() 30564 1726882806.02277: done queuing things up, now waiting for results queue to drain 30564 1726882806.02278: results queue empty 30564 1726882806.02279: checking for any_errors_fatal 
30564 1726882806.02281: done checking for any_errors_fatal 30564 1726882806.02281: checking for max_fail_percentage 30564 1726882806.02282: done checking for max_fail_percentage 30564 1726882806.02286: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.02286: done checking to see if all hosts have failed 30564 1726882806.02287: getting the remaining hosts for this loop 30564 1726882806.02287: done getting the remaining hosts for this loop 30564 1726882806.02289: getting the next task for host managed_node2 30564 1726882806.02291: done getting next task for host managed_node2 30564 1726882806.02292: ^ task is: TASK: Show playbook name 30564 1726882806.02293: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882806.02295: getting variables 30564 1726882806.02295: in VariableManager get_vars() 30564 1726882806.02301: Calling all_inventory to load vars for managed_node2 30564 1726882806.02303: Calling groups_inventory to load vars for managed_node2 30564 1726882806.02304: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.02307: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.02308: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.02310: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.02396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.02513: done with get_vars() 30564 1726882806.02519: done getting variables 30564 1726882806.02575: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:11 Friday 20 September 2024 21:40:06 -0400 (0:00:01.007) 0:00:04.607 ****** 30564 1726882806.02593: entering _queue_task() for managed_node2/debug 30564 1726882806.02594: Creating lock for debug 30564 1726882806.02786: worker is 1 (out of 1 available) 30564 1726882806.02797: exiting _queue_task() for managed_node2/debug 30564 1726882806.02808: done queuing things up, now waiting for results queue to drain 30564 1726882806.02810: waiting for pending results... 30564 1726882806.02960: running TaskExecutor() for managed_node2/TASK: Show playbook name 30564 1726882806.03014: in run() - task 0e448fcc-3ce9-4216-acec-00000000000b 30564 1726882806.03024: variable 'ansible_search_path' from source: unknown 30564 1726882806.03052: calling self._execute() 30564 1726882806.03110: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.03114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.03123: variable 'omit' from source: magic vars 30564 1726882806.03444: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.03460: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.03466: variable 'omit' from source: magic vars 30564 1726882806.03490: variable 'omit' from source: magic vars 30564 1726882806.03515: variable 'omit' from source: magic vars 30564 1726882806.03542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882806.03573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 30564 1726882806.03588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882806.03601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.03610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.03632: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.03639: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.03642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.03712: Set connection var ansible_timeout to 10 30564 1726882806.03715: Set connection var ansible_pipelining to False 30564 1726882806.03718: Set connection var ansible_shell_type to sh 30564 1726882806.03723: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.03733: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.03736: Set connection var ansible_connection to ssh 30564 1726882806.03755: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.03757: variable 'ansible_connection' from source: unknown 30564 1726882806.03760: variable 'ansible_module_compression' from source: unknown 30564 1726882806.03763: variable 'ansible_shell_type' from source: unknown 30564 1726882806.03767: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.03774: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.03777: variable 'ansible_pipelining' from source: unknown 30564 1726882806.03780: variable 'ansible_timeout' from source: unknown 30564 1726882806.03784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.03885: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.03892: variable 'omit' from source: magic vars 30564 1726882806.03897: starting attempt loop 30564 1726882806.03900: running the handler 30564 1726882806.03950: handler run complete 30564 1726882806.03975: attempt loop complete, returning result 30564 1726882806.03983: _execute() done 30564 1726882806.03989: dumping result to json 30564 1726882806.03994: done dumping result, returning 30564 1726882806.04004: done running TaskExecutor() for managed_node2/TASK: Show playbook name [0e448fcc-3ce9-4216-acec-00000000000b] 30564 1726882806.04012: sending task result for task 0e448fcc-3ce9-4216-acec-00000000000b 30564 1726882806.04104: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000000b 30564 1726882806.04111: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: this is: playbooks/tests_states.yml 30564 1726882806.04255: no more pending results, returning what we have 30564 1726882806.04258: results queue empty 30564 1726882806.04259: checking for any_errors_fatal 30564 1726882806.04260: done checking for any_errors_fatal 30564 1726882806.04261: checking for max_fail_percentage 30564 1726882806.04262: done checking for max_fail_percentage 30564 1726882806.04265: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.04266: done checking to see if all hosts have failed 30564 1726882806.04269: getting the remaining hosts for this loop 30564 1726882806.04270: done getting the remaining hosts for this loop 30564 1726882806.04273: getting the next task for host managed_node2 30564 1726882806.04286: done getting next task for host managed_node2 30564 1726882806.04290: ^ task is: TASK: Include the task 'run_test.yml' 30564 1726882806.04291: ^ 
state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882806.04294: getting variables 30564 1726882806.04296: in VariableManager get_vars() 30564 1726882806.04319: Calling all_inventory to load vars for managed_node2 30564 1726882806.04321: Calling groups_inventory to load vars for managed_node2 30564 1726882806.04339: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.04349: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.04352: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.04355: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.04612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.04844: done with get_vars() 30564 1726882806.04852: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:22 Friday 20 September 2024 21:40:06 -0400 (0:00:00.023) 0:00:04.630 ****** 30564 1726882806.04945: entering _queue_task() for managed_node2/include_tasks 30564 1726882806.05173: worker is 1 (out of 1 available) 30564 1726882806.05184: exiting _queue_task() for managed_node2/include_tasks 30564 1726882806.05194: done queuing things up, now waiting for results queue to drain 30564 1726882806.05196: waiting for pending results... 
30564 1726882806.05469: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30564 1726882806.05531: in run() - task 0e448fcc-3ce9-4216-acec-00000000000d 30564 1726882806.05542: variable 'ansible_search_path' from source: unknown 30564 1726882806.05573: calling self._execute() 30564 1726882806.05632: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.05640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.05649: variable 'omit' from source: magic vars 30564 1726882806.05925: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.05936: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.05941: _execute() done 30564 1726882806.05944: dumping result to json 30564 1726882806.05946: done dumping result, returning 30564 1726882806.05952: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-4216-acec-00000000000d] 30564 1726882806.05957: sending task result for task 0e448fcc-3ce9-4216-acec-00000000000d 30564 1726882806.06056: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000000d 30564 1726882806.06058: WORKER PROCESS EXITING 30564 1726882806.06088: no more pending results, returning what we have 30564 1726882806.06092: in VariableManager get_vars() 30564 1726882806.06116: Calling all_inventory to load vars for managed_node2 30564 1726882806.06118: Calling groups_inventory to load vars for managed_node2 30564 1726882806.06121: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.06129: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.06132: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.06134: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.06247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30564 1726882806.06389: done with get_vars() 30564 1726882806.06394: variable 'ansible_search_path' from source: unknown 30564 1726882806.06403: we have included files to process 30564 1726882806.06404: generating all_blocks data 30564 1726882806.06404: done generating all_blocks data 30564 1726882806.06405: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882806.06406: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882806.06407: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882806.06727: in VariableManager get_vars() 30564 1726882806.06737: done with get_vars() 30564 1726882806.06775: in VariableManager get_vars() 30564 1726882806.06785: done with get_vars() 30564 1726882806.06810: in VariableManager get_vars() 30564 1726882806.06820: done with get_vars() 30564 1726882806.06846: in VariableManager get_vars() 30564 1726882806.06855: done with get_vars() 30564 1726882806.06883: in VariableManager get_vars() 30564 1726882806.06892: done with get_vars() 30564 1726882806.07123: in VariableManager get_vars() 30564 1726882806.07134: done with get_vars() 30564 1726882806.07141: done processing included file 30564 1726882806.07142: iterating over new_blocks loaded from include file 30564 1726882806.07143: in VariableManager get_vars() 30564 1726882806.07150: done with get_vars() 30564 1726882806.07151: filtering new block on tags 30564 1726882806.07210: done filtering new block on tags 30564 1726882806.07212: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30564 1726882806.07215: extending task lists for all hosts with included 
blocks 30564 1726882806.07237: done extending task lists 30564 1726882806.07238: done processing included files 30564 1726882806.07238: results queue empty 30564 1726882806.07239: checking for any_errors_fatal 30564 1726882806.07240: done checking for any_errors_fatal 30564 1726882806.07241: checking for max_fail_percentage 30564 1726882806.07241: done checking for max_fail_percentage 30564 1726882806.07242: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.07243: done checking to see if all hosts have failed 30564 1726882806.07243: getting the remaining hosts for this loop 30564 1726882806.07244: done getting the remaining hosts for this loop 30564 1726882806.07245: getting the next task for host managed_node2 30564 1726882806.07248: done getting next task for host managed_node2 30564 1726882806.07249: ^ task is: TASK: TEST: {{ lsr_description }} 30564 1726882806.07251: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882806.07253: getting variables 30564 1726882806.07254: in VariableManager get_vars() 30564 1726882806.07266: Calling all_inventory to load vars for managed_node2 30564 1726882806.07270: Calling groups_inventory to load vars for managed_node2 30564 1726882806.07275: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.07282: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.07284: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.07287: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.07404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.07542: done with get_vars() 30564 1726882806.07548: done getting variables 30564 1726882806.07587: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882806.07671: variable 'lsr_description' from source: include params

TASK [TEST: I can create a profile] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5
Friday 20 September 2024  21:40:06 -0400 (0:00:00.027)       0:00:04.658 ******
30564 1726882806.07702: entering _queue_task() for managed_node2/debug 30564 1726882806.07868: worker is 1 (out of 1 available) 30564 1726882806.07882: exiting _queue_task() for managed_node2/debug 30564 1726882806.07893: done queuing things up, now waiting for results queue to drain 30564 1726882806.07894: waiting for pending results... 
30564 1726882806.08049: running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile 30564 1726882806.08119: in run() - task 0e448fcc-3ce9-4216-acec-000000000091 30564 1726882806.08125: variable 'ansible_search_path' from source: unknown 30564 1726882806.08131: variable 'ansible_search_path' from source: unknown 30564 1726882806.08159: calling self._execute() 30564 1726882806.08247: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.08265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.08281: variable 'omit' from source: magic vars 30564 1726882806.08630: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.08647: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.08659: variable 'omit' from source: magic vars 30564 1726882806.08707: variable 'omit' from source: magic vars 30564 1726882806.08813: variable 'lsr_description' from source: include params 30564 1726882806.08833: variable 'omit' from source: magic vars 30564 1726882806.08876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882806.08922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.08944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882806.08970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.08990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.09032: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.09042: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.09050: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.09161: Set connection var ansible_timeout to 10 30564 1726882806.09175: Set connection var ansible_pipelining to False 30564 1726882806.09183: Set connection var ansible_shell_type to sh 30564 1726882806.09198: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.09204: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.09206: Set connection var ansible_connection to ssh 30564 1726882806.09226: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.09231: variable 'ansible_connection' from source: unknown 30564 1726882806.09237: variable 'ansible_module_compression' from source: unknown 30564 1726882806.09241: variable 'ansible_shell_type' from source: unknown 30564 1726882806.09243: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.09245: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.09247: variable 'ansible_pipelining' from source: unknown 30564 1726882806.09251: variable 'ansible_timeout' from source: unknown 30564 1726882806.09258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.09368: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.09381: variable 'omit' from source: magic vars 30564 1726882806.09384: starting attempt loop 30564 1726882806.09387: running the handler 30564 1726882806.09420: handler run complete 30564 1726882806.09433: attempt loop complete, returning result 30564 1726882806.09436: _execute() done 30564 1726882806.09438: dumping result to json 30564 1726882806.09443: done dumping result, returning 30564 1726882806.09452: done 
running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile [0e448fcc-3ce9-4216-acec-000000000091] 30564 1726882806.09461: sending task result for task 0e448fcc-3ce9-4216-acec-000000000091 30564 1726882806.09540: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000091 30564 1726882806.09543: WORKER PROCESS EXITING
ok: [managed_node2] => {}

MSG:

########## I can create a profile ##########
30564 1726882806.09600: no more pending results, returning what we have 30564 1726882806.09602: results queue empty 30564 1726882806.09603: checking for any_errors_fatal 30564 1726882806.09605: done checking for any_errors_fatal 30564 1726882806.09605: checking for max_fail_percentage 30564 1726882806.09607: done checking for max_fail_percentage 30564 1726882806.09607: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.09608: done checking to see if all hosts have failed 30564 1726882806.09609: getting the remaining hosts for this loop 30564 1726882806.09610: done getting the remaining hosts for this loop 30564 1726882806.09613: getting the next task for host managed_node2 30564 1726882806.09617: done getting next task for host managed_node2 30564 1726882806.09619: ^ task is: TASK: Show item 30564 1726882806.09622: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882806.09625: getting variables 30564 1726882806.09626: in VariableManager get_vars() 30564 1726882806.09645: Calling all_inventory to load vars for managed_node2 30564 1726882806.09646: Calling groups_inventory to load vars for managed_node2 30564 1726882806.09648: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.09657: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.09659: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.09661: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.09778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.09918: done with get_vars() 30564 1726882806.09924: done getting variables 30564 1726882806.09958: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024  21:40:06 -0400 (0:00:00.022)       0:00:04.681 ******
30564 1726882806.09980: entering _queue_task() for managed_node2/debug 30564 1726882806.10128: worker is 1 (out of 1 available) 30564 1726882806.10139: exiting _queue_task() for managed_node2/debug 30564 1726882806.10149: done queuing things up, now waiting for results queue to drain 30564 1726882806.10150: waiting for pending results... 
30564 1726882806.10286: running TaskExecutor() for managed_node2/TASK: Show item 30564 1726882806.10336: in run() - task 0e448fcc-3ce9-4216-acec-000000000092 30564 1726882806.10347: variable 'ansible_search_path' from source: unknown 30564 1726882806.10350: variable 'ansible_search_path' from source: unknown 30564 1726882806.10387: variable 'omit' from source: magic vars 30564 1726882806.10470: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.10476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.10485: variable 'omit' from source: magic vars 30564 1726882806.10705: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.10715: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.10720: variable 'omit' from source: magic vars 30564 1726882806.10745: variable 'omit' from source: magic vars 30564 1726882806.10782: variable 'item' from source: unknown 30564 1726882806.10831: variable 'item' from source: unknown 30564 1726882806.10847: variable 'omit' from source: magic vars 30564 1726882806.10884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882806.10906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.10920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882806.10935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.10944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.10969: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.10979: variable 'ansible_host' from source: host vars for 'managed_node2' 
30564 1726882806.10982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.11088: Set connection var ansible_timeout to 10 30564 1726882806.11468: Set connection var ansible_pipelining to False 30564 1726882806.11472: Set connection var ansible_shell_type to sh 30564 1726882806.11475: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.11477: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.11480: Set connection var ansible_connection to ssh 30564 1726882806.11482: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.11484: variable 'ansible_connection' from source: unknown 30564 1726882806.11486: variable 'ansible_module_compression' from source: unknown 30564 1726882806.11488: variable 'ansible_shell_type' from source: unknown 30564 1726882806.11490: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.11492: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.11494: variable 'ansible_pipelining' from source: unknown 30564 1726882806.11870: variable 'ansible_timeout' from source: unknown 30564 1726882806.11873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.11877: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.11880: variable 'omit' from source: magic vars 30564 1726882806.11882: starting attempt loop 30564 1726882806.11884: running the handler 30564 1726882806.11886: variable 'lsr_description' from source: include params 30564 1726882806.11888: variable 'lsr_description' from source: include params 30564 1726882806.11890: handler run complete 30564 1726882806.11892: attempt loop 
complete, returning result 30564 1726882806.11894: variable 'item' from source: unknown 30564 1726882806.11896: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can create a profile"
}
30564 1726882806.12003: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.12006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.12009: variable 'omit' from source: magic vars 30564 1726882806.12011: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.12013: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.12015: variable 'omit' from source: magic vars 30564 1726882806.12017: variable 'omit' from source: magic vars 30564 1726882806.12019: variable 'item' from source: unknown 30564 1726882806.12021: variable 'item' from source: unknown 30564 1726882806.12023: variable 'omit' from source: magic vars 30564 1726882806.12025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.12027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.12029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.12031: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.12033: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.12035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.12037: Set connection var ansible_timeout to 10 30564 1726882806.12039: Set connection var ansible_pipelining to False 30564 1726882806.12042: 
Set connection var ansible_shell_type to sh 30564 1726882806.12043: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.12082: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.12084: Set connection var ansible_connection to ssh 30564 1726882806.12089: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.12092: variable 'ansible_connection' from source: unknown 30564 1726882806.12094: variable 'ansible_module_compression' from source: unknown 30564 1726882806.12191: variable 'ansible_shell_type' from source: unknown 30564 1726882806.12194: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.12200: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.12203: variable 'ansible_pipelining' from source: unknown 30564 1726882806.12205: variable 'ansible_timeout' from source: unknown 30564 1726882806.12207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.12224: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.12237: variable 'omit' from source: magic vars 30564 1726882806.12250: starting attempt loop 30564 1726882806.12258: running the handler 30564 1726882806.12287: variable 'lsr_setup' from source: include params 30564 1726882806.12373: variable 'lsr_setup' from source: include params 30564 1726882806.12429: handler run complete 30564 1726882806.12448: attempt loop complete, returning result 30564 1726882806.12470: variable 'item' from source: unknown 30564 1726882806.12545: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_setup) => {
    "ansible_loop_var": "item",
    "item": "lsr_setup",
    "lsr_setup": [
        "tasks/delete_interface.yml",
        "tasks/assert_device_absent.yml"
    ]
}
30564 1726882806.12716: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.12730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.12742: variable 'omit' from source: magic vars 30564 1726882806.12912: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.12922: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.12929: variable 'omit' from source: magic vars 30564 1726882806.12945: variable 'omit' from source: magic vars 30564 1726882806.12995: variable 'item' from source: unknown 30564 1726882806.13051: variable 'item' from source: unknown 30564 1726882806.13073: variable 'omit' from source: magic vars 30564 1726882806.13100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.13111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.13120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.13133: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.13140: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.13146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.13220: Set connection var ansible_timeout to 10 30564 1726882806.13229: Set connection var ansible_pipelining to False 30564 1726882806.13235: Set connection var ansible_shell_type to sh 30564 1726882806.13243: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.13253: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.13259: Set 
connection var ansible_connection to ssh 30564 1726882806.13285: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.13292: variable 'ansible_connection' from source: unknown 30564 1726882806.13297: variable 'ansible_module_compression' from source: unknown 30564 1726882806.13310: variable 'ansible_shell_type' from source: unknown 30564 1726882806.13316: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.13321: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.13328: variable 'ansible_pipelining' from source: unknown 30564 1726882806.13333: variable 'ansible_timeout' from source: unknown 30564 1726882806.13339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.13425: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.13436: variable 'omit' from source: magic vars 30564 1726882806.13443: starting attempt loop 30564 1726882806.13448: running the handler 30564 1726882806.13472: variable 'lsr_test' from source: include params 30564 1726882806.13539: variable 'lsr_test' from source: include params 30564 1726882806.13559: handler run complete 30564 1726882806.13581: attempt loop complete, returning result 30564 1726882806.13598: variable 'item' from source: unknown 30564 1726882806.13662: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_test) => {
    "ansible_loop_var": "item",
    "item": "lsr_test",
    "lsr_test": [
        "tasks/create_bridge_profile.yml"
    ]
}
30564 1726882806.13802: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.13814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.13826: variable 
'omit' from source: magic vars 30564 1726882806.13978: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.13989: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.13997: variable 'omit' from source: magic vars 30564 1726882806.14014: variable 'omit' from source: magic vars 30564 1726882806.14056: variable 'item' from source: unknown 30564 1726882806.14130: variable 'item' from source: unknown 30564 1726882806.14151: variable 'omit' from source: magic vars 30564 1726882806.14181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.14194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.14205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.14220: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.14227: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.14234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.14319: Set connection var ansible_timeout to 10 30564 1726882806.14330: Set connection var ansible_pipelining to False 30564 1726882806.14336: Set connection var ansible_shell_type to sh 30564 1726882806.14345: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.14354: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.14360: Set connection var ansible_connection to ssh 30564 1726882806.14386: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.14397: variable 'ansible_connection' from source: unknown 30564 1726882806.14408: variable 'ansible_module_compression' from source: unknown 30564 
1726882806.14414: variable 'ansible_shell_type' from source: unknown 30564 1726882806.14419: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.14425: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.14431: variable 'ansible_pipelining' from source: unknown 30564 1726882806.14436: variable 'ansible_timeout' from source: unknown 30564 1726882806.14442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.14534: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.14545: variable 'omit' from source: magic vars 30564 1726882806.14552: starting attempt loop 30564 1726882806.14558: running the handler 30564 1726882806.14582: variable 'lsr_assert' from source: include params 30564 1726882806.14651: variable 'lsr_assert' from source: include params 30564 1726882806.14677: handler run complete 30564 1726882806.14695: attempt loop complete, returning result 30564 1726882806.14715: variable 'item' from source: unknown 30564 1726882806.14791: variable 'item' from source: unknown
ok: [managed_node2] => (item=lsr_assert) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert",
    "lsr_assert": [
        "tasks/assert_profile_present.yml"
    ]
}
30564 1726882806.14935: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.14950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.14969: variable 'omit' from source: magic vars 30564 1726882806.15429: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.15441: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.15455: variable 'omit' from source: magic 
vars 30564 1726882806.15480: variable 'omit' from source: magic vars 30564 1726882806.15525: variable 'item' from source: unknown 30564 1726882806.15600: variable 'item' from source: unknown 30564 1726882806.15619: variable 'omit' from source: magic vars 30564 1726882806.15647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.15659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.15674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.15689: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.15697: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.15703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.15785: Set connection var ansible_timeout to 10 30564 1726882806.15795: Set connection var ansible_pipelining to False 30564 1726882806.15801: Set connection var ansible_shell_type to sh 30564 1726882806.15809: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.15819: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.15824: Set connection var ansible_connection to ssh 30564 1726882806.15849: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.15863: variable 'ansible_connection' from source: unknown 30564 1726882806.15875: variable 'ansible_module_compression' from source: unknown 30564 1726882806.15882: variable 'ansible_shell_type' from source: unknown 30564 1726882806.15888: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.15894: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.15901: variable 
'ansible_pipelining' from source: unknown 30564 1726882806.15907: variable 'ansible_timeout' from source: unknown 30564 1726882806.15913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.16006: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.16018: variable 'omit' from source: magic vars 30564 1726882806.16025: starting attempt loop 30564 1726882806.16031: running the handler 30564 1726882806.16051: variable 'lsr_assert_when' from source: include params 30564 1726882806.16124: variable 'lsr_assert_when' from source: include params 30564 1726882806.16220: variable 'network_provider' from source: set_fact 30564 1726882806.16255: handler run complete 30564 1726882806.16280: attempt loop complete, returning result 30564 1726882806.16305: variable 'item' from source: unknown 30564 1726882806.16369: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_present.yml" } ] } 30564 1726882806.16514: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.16528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.16541: variable 'omit' from source: magic vars 30564 1726882806.16696: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.16706: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.16713: variable 'omit' from source: magic vars 30564 1726882806.16728: variable 'omit' from source: magic vars 30564 1726882806.16771: variable 'item' from source: unknown 30564 
1726882806.16839: variable 'item' from source: unknown 30564 1726882806.16858: variable 'omit' from source: magic vars 30564 1726882806.16884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.16903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.16913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.16927: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.16934: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.16941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.17024: Set connection var ansible_timeout to 10 30564 1726882806.17034: Set connection var ansible_pipelining to False 30564 1726882806.17041: Set connection var ansible_shell_type to sh 30564 1726882806.17049: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.17059: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.17069: Set connection var ansible_connection to ssh 30564 1726882806.17094: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.17104: variable 'ansible_connection' from source: unknown 30564 1726882806.17116: variable 'ansible_module_compression' from source: unknown 30564 1726882806.17126: variable 'ansible_shell_type' from source: unknown 30564 1726882806.17132: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.17138: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.17145: variable 'ansible_pipelining' from source: unknown 30564 1726882806.17151: variable 'ansible_timeout' from source: unknown 30564 1726882806.17158: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.17254: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.17272: variable 'omit' from source: magic vars 30564 1726882806.17282: starting attempt loop 30564 1726882806.17289: running the handler 30564 1726882806.17310: variable 'lsr_fail_debug' from source: play vars 30564 1726882806.17386: variable 'lsr_fail_debug' from source: play vars 30564 1726882806.17406: handler run complete 30564 1726882806.17422: attempt loop complete, returning result 30564 1726882806.17450: variable 'item' from source: unknown 30564 1726882806.17519: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30564 1726882806.17673: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.17688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.17703: variable 'omit' from source: magic vars 30564 1726882806.17861: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.17877: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.17886: variable 'omit' from source: magic vars 30564 1726882806.17902: variable 'omit' from source: magic vars 30564 1726882806.17950: variable 'item' from source: unknown 30564 1726882806.18015: variable 'item' from source: unknown 30564 1726882806.18033: variable 'omit' from source: magic vars 30564 1726882806.18060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 30564 1726882806.18081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.18092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.18106: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.18113: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.18119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.18200: Set connection var ansible_timeout to 10 30564 1726882806.18210: Set connection var ansible_pipelining to False 30564 1726882806.18217: Set connection var ansible_shell_type to sh 30564 1726882806.18225: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.18236: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.18242: Set connection var ansible_connection to ssh 30564 1726882806.18275: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.18284: variable 'ansible_connection' from source: unknown 30564 1726882806.18291: variable 'ansible_module_compression' from source: unknown 30564 1726882806.18297: variable 'ansible_shell_type' from source: unknown 30564 1726882806.18302: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.18308: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.18315: variable 'ansible_pipelining' from source: unknown 30564 1726882806.18321: variable 'ansible_timeout' from source: unknown 30564 1726882806.18327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.18419: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.18430: variable 'omit' from source: magic vars 30564 1726882806.18437: starting attempt loop 30564 1726882806.18442: running the handler 30564 1726882806.18462: variable 'lsr_cleanup' from source: include params 30564 1726882806.18530: variable 'lsr_cleanup' from source: include params 30564 1726882806.18551: handler run complete 30564 1726882806.18576: attempt loop complete, returning result 30564 1726882806.18603: variable 'item' from source: unknown 30564 1726882806.18669: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30564 1726882806.18773: dumping result to json 30564 1726882806.18787: done dumping result, returning 30564 1726882806.18798: done running TaskExecutor() for managed_node2/TASK: Show item [0e448fcc-3ce9-4216-acec-000000000092] 30564 1726882806.18808: sending task result for task 0e448fcc-3ce9-4216-acec-000000000092 30564 1726882806.18931: no more pending results, returning what we have 30564 1726882806.18934: results queue empty 30564 1726882806.18934: checking for any_errors_fatal 30564 1726882806.18940: done checking for any_errors_fatal 30564 1726882806.18941: checking for max_fail_percentage 30564 1726882806.18942: done checking for max_fail_percentage 30564 1726882806.18943: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.18944: done checking to see if all hosts have failed 30564 1726882806.18945: getting the remaining hosts for this loop 30564 1726882806.18946: done getting the remaining hosts for this loop 30564 1726882806.18949: getting the next task for host managed_node2 30564 1726882806.18956: done getting next task for host managed_node2 30564 
1726882806.18958: ^ task is: TASK: Include the task 'show_interfaces.yml' 30564 1726882806.18961: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882806.18966: getting variables 30564 1726882806.18968: in VariableManager get_vars() 30564 1726882806.18994: Calling all_inventory to load vars for managed_node2 30564 1726882806.18997: Calling groups_inventory to load vars for managed_node2 30564 1726882806.19000: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.19007: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000092 30564 1726882806.19010: WORKER PROCESS EXITING 30564 1726882806.19021: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.19024: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.19027: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.19195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.19570: done with get_vars() 30564 1726882806.19579: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:40:06 -0400 (0:00:00.096) 0:00:04.777 ****** 30564 1726882806.19657: entering _queue_task() for managed_node2/include_tasks 30564 
1726882806.19887: worker is 1 (out of 1 available) 30564 1726882806.19898: exiting _queue_task() for managed_node2/include_tasks 30564 1726882806.19909: done queuing things up, now waiting for results queue to drain 30564 1726882806.19910: waiting for pending results... 30564 1726882806.20152: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30564 1726882806.20258: in run() - task 0e448fcc-3ce9-4216-acec-000000000093 30564 1726882806.20279: variable 'ansible_search_path' from source: unknown 30564 1726882806.20287: variable 'ansible_search_path' from source: unknown 30564 1726882806.20326: calling self._execute() 30564 1726882806.20410: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.20423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.20438: variable 'omit' from source: magic vars 30564 1726882806.20781: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.20803: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.20814: _execute() done 30564 1726882806.20821: dumping result to json 30564 1726882806.20828: done dumping result, returning 30564 1726882806.20837: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-4216-acec-000000000093] 30564 1726882806.20849: sending task result for task 0e448fcc-3ce9-4216-acec-000000000093 30564 1726882806.20965: no more pending results, returning what we have 30564 1726882806.20970: in VariableManager get_vars() 30564 1726882806.21007: Calling all_inventory to load vars for managed_node2 30564 1726882806.21010: Calling groups_inventory to load vars for managed_node2 30564 1726882806.21013: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.21026: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.21029: Calling groups_plugins_inventory to load 
vars for managed_node2 30564 1726882806.21032: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.21208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.21333: done with get_vars() 30564 1726882806.21338: variable 'ansible_search_path' from source: unknown 30564 1726882806.21339: variable 'ansible_search_path' from source: unknown 30564 1726882806.21358: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000093 30564 1726882806.21361: WORKER PROCESS EXITING 30564 1726882806.21379: we have included files to process 30564 1726882806.21380: generating all_blocks data 30564 1726882806.21381: done generating all_blocks data 30564 1726882806.21384: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882806.21385: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882806.21387: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882806.21504: in VariableManager get_vars() 30564 1726882806.21517: done with get_vars() 30564 1726882806.21587: done processing included file 30564 1726882806.21588: iterating over new_blocks loaded from include file 30564 1726882806.21589: in VariableManager get_vars() 30564 1726882806.21597: done with get_vars() 30564 1726882806.21598: filtering new block on tags 30564 1726882806.21620: done filtering new block on tags 30564 1726882806.21622: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30564 1726882806.21625: extending task lists for all hosts with included blocks 30564 1726882806.21920: 
done extending task lists 30564 1726882806.21921: done processing included files 30564 1726882806.21922: results queue empty 30564 1726882806.21922: checking for any_errors_fatal 30564 1726882806.21927: done checking for any_errors_fatal 30564 1726882806.21927: checking for max_fail_percentage 30564 1726882806.21928: done checking for max_fail_percentage 30564 1726882806.21929: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.21930: done checking to see if all hosts have failed 30564 1726882806.21930: getting the remaining hosts for this loop 30564 1726882806.21931: done getting the remaining hosts for this loop 30564 1726882806.21932: getting the next task for host managed_node2 30564 1726882806.21935: done getting next task for host managed_node2 30564 1726882806.21937: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30564 1726882806.21940: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882806.21941: getting variables 30564 1726882806.21942: in VariableManager get_vars() 30564 1726882806.21947: Calling all_inventory to load vars for managed_node2 30564 1726882806.21948: Calling groups_inventory to load vars for managed_node2 30564 1726882806.21950: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.21953: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.21954: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.21956: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.22040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.22161: done with get_vars() 30564 1726882806.22169: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:40:06 -0400 (0:00:00.025) 0:00:04.803 ****** 30564 1726882806.22213: entering _queue_task() for managed_node2/include_tasks 30564 1726882806.22376: worker is 1 (out of 1 available) 30564 1726882806.22388: exiting _queue_task() for managed_node2/include_tasks 30564 1726882806.22400: done queuing things up, now waiting for results queue to drain 30564 1726882806.22401: waiting for pending results... 
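The task banners interleaved in this log carry two timings, e.g. `(0:00:00.025) 0:00:04.803`: the per-task elapsed time and the cumulative playbook runtime, both derived from epoch floats like the `1726882806.22213` prefixes on each record. As a minimal, hypothetical Python sketch of that duration formatting (the name `fmt_delta` is my own, not an ansible API):

```python
def fmt_delta(seconds):
    # Render an elapsed-seconds float the way the task banners do,
    # e.g. 4.803 -> "0:00:04.803" (hours:minutes:seconds.millis).
    total = int(seconds)
    hours, rem = divmod(total, 3600)
    minutes, secs = divmod(rem, 60)
    millis = int(round((seconds - total) * 1000))
    return "%d:%02d:%02d.%03d" % (hours, minutes, secs, millis)

# Two _queue_task timestamps taken from the log records around this banner:
start, now = 1726882806.19657, 1726882806.22213
elapsed = fmt_delta(now - start)  # per-task elapsed, on the order of 25 ms
```

The sub-second gaps between consecutive `_queue_task` entries show why the cumulative counter advances only a few milliseconds per included task here.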
30564 1726882806.22549: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30564 1726882806.22619: in run() - task 0e448fcc-3ce9-4216-acec-0000000000ba 30564 1726882806.22635: variable 'ansible_search_path' from source: unknown 30564 1726882806.22642: variable 'ansible_search_path' from source: unknown 30564 1726882806.22681: calling self._execute() 30564 1726882806.22745: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.22755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.22773: variable 'omit' from source: magic vars 30564 1726882806.23045: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.23054: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.23059: _execute() done 30564 1726882806.23062: dumping result to json 30564 1726882806.23067: done dumping result, returning 30564 1726882806.23075: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-4216-acec-0000000000ba] 30564 1726882806.23080: sending task result for task 0e448fcc-3ce9-4216-acec-0000000000ba 30564 1726882806.23155: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000000ba 30564 1726882806.23158: WORKER PROCESS EXITING 30564 1726882806.23187: no more pending results, returning what we have 30564 1726882806.23191: in VariableManager get_vars() 30564 1726882806.23215: Calling all_inventory to load vars for managed_node2 30564 1726882806.23218: Calling groups_inventory to load vars for managed_node2 30564 1726882806.23220: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.23230: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.23232: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.23235: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882806.23349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.23491: done with get_vars() 30564 1726882806.23496: variable 'ansible_search_path' from source: unknown 30564 1726882806.23496: variable 'ansible_search_path' from source: unknown 30564 1726882806.23534: we have included files to process 30564 1726882806.23535: generating all_blocks data 30564 1726882806.23536: done generating all_blocks data 30564 1726882806.23537: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882806.23538: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882806.23540: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882806.23869: done processing included file 30564 1726882806.23871: iterating over new_blocks loaded from include file 30564 1726882806.23873: in VariableManager get_vars() 30564 1726882806.23886: done with get_vars() 30564 1726882806.23888: filtering new block on tags 30564 1726882806.23919: done filtering new block on tags 30564 1726882806.23922: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30564 1726882806.23926: extending task lists for all hosts with included blocks 30564 1726882806.24058: done extending task lists 30564 1726882806.24059: done processing included files 30564 1726882806.24060: results queue empty 30564 1726882806.24060: checking for any_errors_fatal 30564 1726882806.24064: done checking for any_errors_fatal 30564 1726882806.24065: checking for max_fail_percentage 30564 1726882806.24066: done 
checking for max_fail_percentage 30564 1726882806.24067: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.24068: done checking to see if all hosts have failed 30564 1726882806.24068: getting the remaining hosts for this loop 30564 1726882806.24069: done getting the remaining hosts for this loop 30564 1726882806.24072: getting the next task for host managed_node2 30564 1726882806.24075: done getting next task for host managed_node2 30564 1726882806.24077: ^ task is: TASK: Gather current interface info 30564 1726882806.24080: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882806.24082: getting variables 30564 1726882806.24083: in VariableManager get_vars() 30564 1726882806.24089: Calling all_inventory to load vars for managed_node2 30564 1726882806.24091: Calling groups_inventory to load vars for managed_node2 30564 1726882806.24093: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.24097: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.24099: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.24101: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.24227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.24424: done with get_vars() 30564 1726882806.24431: done getting variables 30564 1726882806.24462: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:40:06 -0400 (0:00:00.022) 0:00:04.826 ****** 30564 1726882806.24488: entering _queue_task() for managed_node2/command 30564 1726882806.24660: worker is 1 (out of 1 available) 30564 1726882806.24674: exiting _queue_task() for managed_node2/command 30564 1726882806.24686: done queuing things up, now waiting for results queue to drain 30564 1726882806.24687: waiting for pending results... 
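The command-task execution that follows begins with `_low_level_execute_command()` probing the remote home directory (`/bin/sh -c 'echo ~ && sleep 0'`) and then creating a per-task temp directory whose name is visible later in the log: `ansible-tmp-1726882806.2898643-30842-164863863683354`, i.e. an epoch timestamp, the worker PID, and a random suffix. A rough Python sketch of that naming pattern (an approximation inferred from the log, not ansible's actual implementation; `make_remote_tmp_name` is a hypothetical helper):

```python
import os
import random
import time


def make_remote_tmp_name(base="~/.ansible/tmp"):
    # Approximates the directory name seen in the log:
    #   ansible-tmp-<epoch float>-<pid>-<random integer>
    # The random tail keeps concurrent tasks from colliding even when
    # they share a PID and start within the same clock tick.
    suffix = "%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
    return os.path.join(base, "ansible-tmp-%s" % suffix)


name = make_remote_tmp_name()
```

Because the name embeds the PID (`30842` here, a forked worker rather than the main `30564` process), the tmp paths in the log double as a record of which worker ran the task.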
30564 1726882806.24823: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30564 1726882806.24889: in run() - task 0e448fcc-3ce9-4216-acec-0000000000f5 30564 1726882806.24899: variable 'ansible_search_path' from source: unknown 30564 1726882806.24905: variable 'ansible_search_path' from source: unknown 30564 1726882806.24933: calling self._execute() 30564 1726882806.24986: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.24990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.24999: variable 'omit' from source: magic vars 30564 1726882806.25251: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.25261: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.25271: variable 'omit' from source: magic vars 30564 1726882806.25299: variable 'omit' from source: magic vars 30564 1726882806.25321: variable 'omit' from source: magic vars 30564 1726882806.25357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882806.25408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.25430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882806.25451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.25469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.25502: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.25511: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.25518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882806.25615: Set connection var ansible_timeout to 10 30564 1726882806.25624: Set connection var ansible_pipelining to False 30564 1726882806.25630: Set connection var ansible_shell_type to sh 30564 1726882806.25638: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.25649: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.25654: Set connection var ansible_connection to ssh 30564 1726882806.25683: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.25691: variable 'ansible_connection' from source: unknown 30564 1726882806.25698: variable 'ansible_module_compression' from source: unknown 30564 1726882806.25704: variable 'ansible_shell_type' from source: unknown 30564 1726882806.25710: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.25716: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.25722: variable 'ansible_pipelining' from source: unknown 30564 1726882806.25728: variable 'ansible_timeout' from source: unknown 30564 1726882806.25734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.25862: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.25882: variable 'omit' from source: magic vars 30564 1726882806.25891: starting attempt loop 30564 1726882806.25897: running the handler 30564 1726882806.25913: _low_level_execute_command(): starting 30564 1726882806.25924: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882806.26630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882806.26645: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882806.26659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.26680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.26722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.26734: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882806.26750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.26773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882806.26786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882806.26798: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882806.26811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.26826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.26840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.26853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.26868: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882806.26884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.26956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.26981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882806.26998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882806.27133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30564 1726882806.28800: stdout chunk (state=3): >>>/root <<< 30564 1726882806.28906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.28959: stderr chunk (state=3): >>><<< 30564 1726882806.28969: stdout chunk (state=3): >>><<< 30564 1726882806.28988: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882806.29001: _low_level_execute_command(): starting 30564 1726882806.29005: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354 `" && echo ansible-tmp-1726882806.2898643-30842-164863863683354="` echo /root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354 `" ) && sleep 0' 30564 1726882806.29658: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882806.29679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.29693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.29711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.29755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.29775: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882806.29790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.29806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882806.29817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882806.29827: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882806.29840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.29857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.29879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.29893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.29904: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882806.29916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.30000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.30019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882806.30034: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882806.30169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.32071: stdout chunk (state=3): >>>ansible-tmp-1726882806.2898643-30842-164863863683354=/root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354 <<< 30564 1726882806.32182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.32237: stderr chunk (state=3): >>><<< 30564 1726882806.32240: stdout chunk (state=3): >>><<< 30564 1726882806.32669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882806.2898643-30842-164863863683354=/root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882806.32673: variable 'ansible_module_compression' from source: unknown 30564 1726882806.32675: 
ANSIBALLZ: Using generic lock for ansible.legacy.command 30564 1726882806.32678: ANSIBALLZ: Acquiring lock 30564 1726882806.32680: ANSIBALLZ: Lock acquired: 140506263950048 30564 1726882806.32682: ANSIBALLZ: Creating module 30564 1726882806.45747: ANSIBALLZ: Writing module into payload 30564 1726882806.45870: ANSIBALLZ: Writing module 30564 1726882806.45901: ANSIBALLZ: Renaming module 30564 1726882806.45913: ANSIBALLZ: Done creating module 30564 1726882806.45935: variable 'ansible_facts' from source: unknown 30564 1726882806.46024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354/AnsiballZ_command.py 30564 1726882806.46186: Sending initial data 30564 1726882806.46190: Sent initial data (156 bytes) 30564 1726882806.47121: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882806.47158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.47177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.47194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.47235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.47246: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882806.47259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.47282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882806.47293: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882806.47303: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882806.47314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.47326: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.47341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.47352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.47367: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882806.47382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.47474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.47496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882806.47511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882806.47639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.49483: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882806.49487: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882806.49874: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882806.49879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpgjatckxs 
/root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354/AnsiballZ_command.py <<< 30564 1726882806.51114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.51347: stderr chunk (state=3): >>><<< 30564 1726882806.51350: stdout chunk (state=3): >>><<< 30564 1726882806.51352: done transferring module to remote 30564 1726882806.51354: _low_level_execute_command(): starting 30564 1726882806.51357: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354/ /root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354/AnsiballZ_command.py && sleep 0' 30564 1726882806.52094: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.52098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.52140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882806.52143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882806.52146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.52148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882806.52150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 
1726882806.53793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.53797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882806.53920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.55686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.55751: stderr chunk (state=3): >>><<< 30564 1726882806.55754: stdout chunk (state=3): >>><<< 30564 1726882806.55843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882806.55847: _low_level_execute_command(): starting 30564 1726882806.55849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354/AnsiballZ_command.py && sleep 0' 30564 1726882806.56411: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882806.56423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.56436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.56452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.56497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.56512: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882806.56524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.56540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882806.56550: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882806.56560: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882806.56577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.56591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.56605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.56620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.56630: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882806.56644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.56727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 
1726882806.56747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882806.56761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882806.56899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.70327: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:06.697859", "end": "2024-09-20 21:40:06.701193", "delta": "0:00:00.003334", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882806.71584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882806.71658: stderr chunk (state=3): >>><<< 30564 1726882806.71661: stdout chunk (state=3): >>><<< 30564 1726882806.71804: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:06.697859", "end": "2024-09-20 21:40:06.701193", "delta": "0:00:00.003334", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882806.71814: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882806.71816: _low_level_execute_command(): starting 30564 1726882806.71819: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882806.2898643-30842-164863863683354/ > /dev/null 2>&1 && sleep 0' 30564 1726882806.72389: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882806.72402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.72415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.72431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.72478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.72490: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882806.72502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.72518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882806.72528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882806.72538: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882806.72548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.72561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.72582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.72594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882806.72603: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882806.72615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.72697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.72718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882806.72732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882806.72858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.74750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.74753: stdout chunk (state=3): >>><<< 30564 1726882806.74755: stderr chunk (state=3): >>><<< 30564 1726882806.75019: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882806.75026: handler run complete 30564 1726882806.75033: Evaluated conditional (False): False 30564 1726882806.75036: attempt loop complete, returning result 30564 1726882806.75038: _execute() done 30564 1726882806.75040: dumping result to json 30564 1726882806.75042: done dumping result, returning 30564 1726882806.75044: done running TaskExecutor() for managed_node2/TASK: Gather current interface info 
[0e448fcc-3ce9-4216-acec-0000000000f5] 30564 1726882806.75046: sending task result for task 0e448fcc-3ce9-4216-acec-0000000000f5 30564 1726882806.75175: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000000f5 30564 1726882806.75178: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003334", "end": "2024-09-20 21:40:06.701193", "rc": 0, "start": "2024-09-20 21:40:06.697859" } STDOUT: bonding_masters eth0 lo rpltstbr 30564 1726882806.75256: no more pending results, returning what we have 30564 1726882806.75259: results queue empty 30564 1726882806.75260: checking for any_errors_fatal 30564 1726882806.75264: done checking for any_errors_fatal 30564 1726882806.75265: checking for max_fail_percentage 30564 1726882806.75267: done checking for max_fail_percentage 30564 1726882806.75268: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.75268: done checking to see if all hosts have failed 30564 1726882806.75269: getting the remaining hosts for this loop 30564 1726882806.75271: done getting the remaining hosts for this loop 30564 1726882806.75287: getting the next task for host managed_node2 30564 1726882806.75295: done getting next task for host managed_node2 30564 1726882806.75298: ^ task is: TASK: Set current_interfaces 30564 1726882806.75305: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882806.75309: getting variables 30564 1726882806.75311: in VariableManager get_vars() 30564 1726882806.75355: Calling all_inventory to load vars for managed_node2 30564 1726882806.75359: Calling groups_inventory to load vars for managed_node2 30564 1726882806.75363: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.75379: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.75384: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.75392: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.75819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.76071: done with get_vars() 30564 1726882806.76082: done getting variables 30564 1726882806.76149: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:40:06 -0400 
(0:00:00.516) 0:00:05.343 ****** 30564 1726882806.76182: entering _queue_task() for managed_node2/set_fact 30564 1726882806.76437: worker is 1 (out of 1 available) 30564 1726882806.76457: exiting _queue_task() for managed_node2/set_fact 30564 1726882806.76473: done queuing things up, now waiting for results queue to drain 30564 1726882806.76474: waiting for pending results... 30564 1726882806.77029: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30564 1726882806.77159: in run() - task 0e448fcc-3ce9-4216-acec-0000000000f6 30564 1726882806.77200: variable 'ansible_search_path' from source: unknown 30564 1726882806.77208: variable 'ansible_search_path' from source: unknown 30564 1726882806.77247: calling self._execute() 30564 1726882806.77369: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.77382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.77395: variable 'omit' from source: magic vars 30564 1726882806.77743: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.77761: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.77781: variable 'omit' from source: magic vars 30564 1726882806.77834: variable 'omit' from source: magic vars 30564 1726882806.77949: variable '_current_interfaces' from source: set_fact 30564 1726882806.78089: variable 'omit' from source: magic vars 30564 1726882806.78135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882806.78179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.78207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882806.78229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 
1726882806.78244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.78283: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.78291: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.78300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.78410: Set connection var ansible_timeout to 10 30564 1726882806.78425: Set connection var ansible_pipelining to False 30564 1726882806.78432: Set connection var ansible_shell_type to sh 30564 1726882806.78441: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.78451: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.78457: Set connection var ansible_connection to ssh 30564 1726882806.78488: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.78495: variable 'ansible_connection' from source: unknown 30564 1726882806.78501: variable 'ansible_module_compression' from source: unknown 30564 1726882806.78507: variable 'ansible_shell_type' from source: unknown 30564 1726882806.78512: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.78519: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.78526: variable 'ansible_pipelining' from source: unknown 30564 1726882806.78532: variable 'ansible_timeout' from source: unknown 30564 1726882806.78538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.78671: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.78688: variable 'omit' from 
source: magic vars 30564 1726882806.78697: starting attempt loop 30564 1726882806.78703: running the handler 30564 1726882806.78718: handler run complete 30564 1726882806.78732: attempt loop complete, returning result 30564 1726882806.78738: _execute() done 30564 1726882806.78744: dumping result to json 30564 1726882806.78750: done dumping result, returning 30564 1726882806.78764: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-4216-acec-0000000000f6] 30564 1726882806.78782: sending task result for task 0e448fcc-3ce9-4216-acec-0000000000f6 30564 1726882806.78877: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000000f6 30564 1726882806.78898: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30564 1726882806.79449: no more pending results, returning what we have 30564 1726882806.79452: results queue empty 30564 1726882806.79453: checking for any_errors_fatal 30564 1726882806.79462: done checking for any_errors_fatal 30564 1726882806.79463: checking for max_fail_percentage 30564 1726882806.79466: done checking for max_fail_percentage 30564 1726882806.79467: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.79468: done checking to see if all hosts have failed 30564 1726882806.79469: getting the remaining hosts for this loop 30564 1726882806.79470: done getting the remaining hosts for this loop 30564 1726882806.79474: getting the next task for host managed_node2 30564 1726882806.79483: done getting next task for host managed_node2 30564 1726882806.79486: ^ task is: TASK: Show current_interfaces 30564 1726882806.79489: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882806.79493: getting variables 30564 1726882806.79494: in VariableManager get_vars() 30564 1726882806.79517: Calling all_inventory to load vars for managed_node2 30564 1726882806.79519: Calling groups_inventory to load vars for managed_node2 30564 1726882806.79522: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.79531: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.79537: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.79540: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.79931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.80191: done with get_vars() 30564 1726882806.80200: done getting variables 30564 1726882806.80261: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:40:06 -0400 (0:00:00.041) 0:00:05.384 ****** 30564 1726882806.80290: entering _queue_task() for managed_node2/debug 30564 1726882806.80523: worker is 1 (out of 1 available) 30564 1726882806.80536: exiting _queue_task() for managed_node2/debug 30564 1726882806.80547: done queuing things up, now waiting for results queue to drain 30564 1726882806.80548: waiting for pending results... 30564 1726882806.81393: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30564 1726882806.81519: in run() - task 0e448fcc-3ce9-4216-acec-0000000000bb 30564 1726882806.81535: variable 'ansible_search_path' from source: unknown 30564 1726882806.81545: variable 'ansible_search_path' from source: unknown 30564 1726882806.81583: calling self._execute() 30564 1726882806.81659: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.81676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.81690: variable 'omit' from source: magic vars 30564 1726882806.82031: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.82048: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.82058: variable 'omit' from source: magic vars 30564 1726882806.82110: variable 'omit' from source: magic vars 30564 1726882806.82211: variable 'current_interfaces' from source: set_fact 30564 1726882806.82242: variable 'omit' from source: magic vars 30564 1726882806.82286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882806.82328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.82351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882806.82380: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.82398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.82434: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.82443: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.82451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.82558: Set connection var ansible_timeout to 10 30564 1726882806.82576: Set connection var ansible_pipelining to False 30564 1726882806.82584: Set connection var ansible_shell_type to sh 30564 1726882806.82592: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.82601: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.82606: Set connection var ansible_connection to ssh 30564 1726882806.82631: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.82638: variable 'ansible_connection' from source: unknown 30564 1726882806.82644: variable 'ansible_module_compression' from source: unknown 30564 1726882806.82649: variable 'ansible_shell_type' from source: unknown 30564 1726882806.82654: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.82658: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.82665: variable 'ansible_pipelining' from source: unknown 30564 1726882806.82675: variable 'ansible_timeout' from source: unknown 30564 1726882806.82682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.82826: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882806.82844: variable 'omit' from source: magic vars
30564 1726882806.82856: starting attempt loop
30564 1726882806.82863: running the handler
30564 1726882806.82917: handler run complete
30564 1726882806.82935: attempt loop complete, returning result
30564 1726882806.82942: _execute() done
30564 1726882806.82948: dumping result to json
30564 1726882806.82958: done dumping result, returning
30564 1726882806.82975: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-4216-acec-0000000000bb]
30564 1726882806.82985: sending task result for task 0e448fcc-3ce9-4216-acec-0000000000bb
ok: [managed_node2] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
30564 1726882806.83127: no more pending results, returning what we have
30564 1726882806.83134: results queue empty
30564 1726882806.83135: checking for any_errors_fatal
30564 1726882806.83145: done checking for any_errors_fatal
30564 1726882806.83146: checking for max_fail_percentage
30564 1726882806.83148: done checking for max_fail_percentage
30564 1726882806.83149: checking to see if all hosts have failed and the running result is not ok
30564 1726882806.83150: done checking to see if all hosts have failed
30564 1726882806.83151: getting the remaining hosts for this loop
30564 1726882806.83153: done getting the remaining hosts for this loop
30564 1726882806.83160: getting the next task for host managed_node2
30564 1726882806.83203: done getting next task for host managed_node2
30564 1726882806.83207: ^ task is: TASK: Setup
30564 1726882806.83211: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882806.83215: getting variables 30564 1726882806.83217: in VariableManager get_vars() 30564 1726882806.83242: Calling all_inventory to load vars for managed_node2 30564 1726882806.83245: Calling groups_inventory to load vars for managed_node2 30564 1726882806.83249: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.83260: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.83265: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.83271: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.83621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.83860: done with get_vars() 30564 1726882806.83877: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:40:06 -0400 (0:00:00.036) 0:00:05.421 ****** 30564 1726882806.83974: entering _queue_task() for managed_node2/include_tasks 30564 1726882806.83993: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000000bb 30564 1726882806.84003: WORKER PROCESS EXITING 30564 1726882806.84484: worker is 1 (out of 1 available) 30564 1726882806.84496: exiting _queue_task() for managed_node2/include_tasks 30564 1726882806.84507: done queuing things up, now waiting for results queue to drain 30564 1726882806.84508: waiting for pending results... 
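The Set/Show current_interfaces tasks above record the fact `current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr']`. A minimal sketch of how such a list can be gathered on the managed node, assuming the role enumerates `/sys/class/net` (the helper name `list_ifaces` and the directory override are illustrative, not the role's actual code):

```shell
# Hypothetical helper (assumption, not the role's real task code):
# list kernel network interface names, sorted, as they appear in
# the current_interfaces fact above.
# $1 optionally overrides the sysfs directory (useful for testing).
list_ifaces() {
    ls -1 "${1:-/sys/class/net}" | sort
}
```

Note that `bonding_masters` would appear in such a listing because the bonding driver exposes a control file under `/sys/class/net`, not because it is a real device.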
30564 1726882806.84787: running TaskExecutor() for managed_node2/TASK: Setup 30564 1726882806.84903: in run() - task 0e448fcc-3ce9-4216-acec-000000000094 30564 1726882806.84922: variable 'ansible_search_path' from source: unknown 30564 1726882806.84953: variable 'ansible_search_path' from source: unknown 30564 1726882806.85031: variable 'lsr_setup' from source: include params 30564 1726882806.85353: variable 'lsr_setup' from source: include params 30564 1726882806.85430: variable 'omit' from source: magic vars 30564 1726882806.85603: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.85618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.85646: variable 'omit' from source: magic vars 30564 1726882806.86082: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.86098: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.86109: variable 'item' from source: unknown 30564 1726882806.86185: variable 'item' from source: unknown 30564 1726882806.86223: variable 'item' from source: unknown 30564 1726882806.86294: variable 'item' from source: unknown 30564 1726882806.86539: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.86553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.86591: variable 'omit' from source: magic vars 30564 1726882806.86827: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.86839: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.86848: variable 'item' from source: unknown 30564 1726882806.86952: variable 'item' from source: unknown 30564 1726882806.86991: variable 'item' from source: unknown 30564 1726882806.87059: variable 'item' from source: unknown 30564 1726882806.87162: dumping result to json 30564 1726882806.87173: done dumping result, returning 30564 
1726882806.87183: done running TaskExecutor() for managed_node2/TASK: Setup [0e448fcc-3ce9-4216-acec-000000000094] 30564 1726882806.87192: sending task result for task 0e448fcc-3ce9-4216-acec-000000000094 30564 1726882806.87284: no more pending results, returning what we have 30564 1726882806.87289: in VariableManager get_vars() 30564 1726882806.87318: Calling all_inventory to load vars for managed_node2 30564 1726882806.87320: Calling groups_inventory to load vars for managed_node2 30564 1726882806.87323: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.87335: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.87338: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.87340: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.87546: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000094 30564 1726882806.87550: WORKER PROCESS EXITING 30564 1726882806.87578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.87786: done with get_vars() 30564 1726882806.87793: variable 'ansible_search_path' from source: unknown 30564 1726882806.87794: variable 'ansible_search_path' from source: unknown 30564 1726882806.87831: variable 'ansible_search_path' from source: unknown 30564 1726882806.87833: variable 'ansible_search_path' from source: unknown 30564 1726882806.87861: we have included files to process 30564 1726882806.87862: generating all_blocks data 30564 1726882806.87869: done generating all_blocks data 30564 1726882806.87873: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30564 1726882806.87875: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30564 1726882806.87878: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30564 1726882806.88039: done processing included file 30564 1726882806.88041: iterating over new_blocks loaded from include file 30564 1726882806.88042: in VariableManager get_vars() 30564 1726882806.88050: done with get_vars() 30564 1726882806.88051: filtering new block on tags 30564 1726882806.88071: done filtering new block on tags 30564 1726882806.88073: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 => (item=tasks/delete_interface.yml) 30564 1726882806.88076: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882806.88076: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882806.88078: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882806.88151: in VariableManager get_vars() 30564 1726882806.88162: done with get_vars() 30564 1726882806.88237: done processing included file 30564 1726882806.88238: iterating over new_blocks loaded from include file 30564 1726882806.88239: in VariableManager get_vars() 30564 1726882806.88248: done with get_vars() 30564 1726882806.88250: filtering new block on tags 30564 1726882806.88272: done filtering new block on tags 30564 1726882806.88274: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 30564 1726882806.88277: extending task lists for all hosts with 
included blocks 30564 1726882806.88680: done extending task lists 30564 1726882806.88682: done processing included files 30564 1726882806.88686: results queue empty 30564 1726882806.88687: checking for any_errors_fatal 30564 1726882806.88690: done checking for any_errors_fatal 30564 1726882806.88691: checking for max_fail_percentage 30564 1726882806.88692: done checking for max_fail_percentage 30564 1726882806.88693: checking to see if all hosts have failed and the running result is not ok 30564 1726882806.88694: done checking to see if all hosts have failed 30564 1726882806.88695: getting the remaining hosts for this loop 30564 1726882806.88696: done getting the remaining hosts for this loop 30564 1726882806.88698: getting the next task for host managed_node2 30564 1726882806.88702: done getting next task for host managed_node2 30564 1726882806.88704: ^ task is: TASK: Remove test interface if necessary 30564 1726882806.88707: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882806.88709: getting variables 30564 1726882806.88709: in VariableManager get_vars() 30564 1726882806.88721: Calling all_inventory to load vars for managed_node2 30564 1726882806.88723: Calling groups_inventory to load vars for managed_node2 30564 1726882806.88804: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882806.88810: Calling all_plugins_play to load vars for managed_node2 30564 1726882806.88813: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882806.88816: Calling groups_plugins_play to load vars for managed_node2 30564 1726882806.88969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882806.89195: done with get_vars() 30564 1726882806.89204: done getting variables 30564 1726882806.89244: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:40:06 -0400 (0:00:00.052) 0:00:05.474 ****** 30564 1726882806.89272: entering _queue_task() for managed_node2/command 30564 1726882806.89587: worker is 1 (out of 1 available) 30564 1726882806.89599: exiting _queue_task() for managed_node2/command 30564 1726882806.89611: done queuing things up, now waiting for results queue to drain 30564 1726882806.89612: waiting for pending results... 
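The `_low_level_execute_command()` calls that follow first resolve the remote home directory (`/bin/sh -c 'echo ~ && sleep 0'`) and then create a per-task temporary directory before uploading `AnsiballZ_command.py`. The directory-creation one-liner from the log can be sketched as a standalone function (a simplification for illustration: the real command string is generated by Ansible's shell plugin, and the `ansible-tmp-<epoch>-<pid>-<random>` name is a stand-in):

```shell
# Sketch of the remote tmp-dir creation seen in the log:
# umask 77 makes both mkdir calls produce mode-0700 directories,
# and the created path is echoed back for the controller to parse.
make_remote_tmp() {
    base=$1   # e.g. /root/.ansible/tmp
    name=$2   # e.g. ansible-tmp-<epoch>-<pid>-<random>
    ( umask 77 && mkdir -p "$base" && mkdir "$base/$name" && echo "$base/$name" )
}
```

The trailing `sleep 0` in the logged commands is appended by Ansible and does not change the command's result.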
30564 1726882806.89872: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 30564 1726882806.89939: in run() - task 0e448fcc-3ce9-4216-acec-00000000011b 30564 1726882806.89954: variable 'ansible_search_path' from source: unknown 30564 1726882806.89958: variable 'ansible_search_path' from source: unknown 30564 1726882806.89990: calling self._execute() 30564 1726882806.90046: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.90050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.90059: variable 'omit' from source: magic vars 30564 1726882806.90315: variable 'ansible_distribution_major_version' from source: facts 30564 1726882806.90325: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882806.90335: variable 'omit' from source: magic vars 30564 1726882806.90363: variable 'omit' from source: magic vars 30564 1726882806.90430: variable 'interface' from source: play vars 30564 1726882806.90447: variable 'omit' from source: magic vars 30564 1726882806.90481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882806.90510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882806.90524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882806.90539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.90548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882806.90575: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882806.90579: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.90582: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.90649: Set connection var ansible_timeout to 10 30564 1726882806.90652: Set connection var ansible_pipelining to False 30564 1726882806.90655: Set connection var ansible_shell_type to sh 30564 1726882806.90660: Set connection var ansible_shell_executable to /bin/sh 30564 1726882806.90671: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882806.90674: Set connection var ansible_connection to ssh 30564 1726882806.90691: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.90694: variable 'ansible_connection' from source: unknown 30564 1726882806.90697: variable 'ansible_module_compression' from source: unknown 30564 1726882806.90699: variable 'ansible_shell_type' from source: unknown 30564 1726882806.90706: variable 'ansible_shell_executable' from source: unknown 30564 1726882806.90709: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882806.90713: variable 'ansible_pipelining' from source: unknown 30564 1726882806.90715: variable 'ansible_timeout' from source: unknown 30564 1726882806.90718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882806.90817: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882806.90823: variable 'omit' from source: magic vars 30564 1726882806.90832: starting attempt loop 30564 1726882806.90835: running the handler 30564 1726882806.90843: _low_level_execute_command(): starting 30564 1726882806.90849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882806.91343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.91348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.91380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.91385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882806.91387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.91436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.91439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882806.91446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882806.91551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.93215: stdout chunk (state=3): >>>/root <<< 30564 1726882806.93323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.93372: stderr chunk (state=3): >>><<< 30564 1726882806.93376: stdout chunk (state=3): >>><<< 30564 1726882806.93388: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882806.93398: _low_level_execute_command(): starting 30564 1726882806.93404: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340 `" && echo ansible-tmp-1726882806.9338794-30886-17438417190340="` echo /root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340 `" ) && sleep 0' 30564 1726882806.93825: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882806.93845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.93849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882806.93851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.93896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found <<< 30564 1726882806.93901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882806.93904: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.93906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.93955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.93958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882806.94075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.95926: stdout chunk (state=3): >>>ansible-tmp-1726882806.9338794-30886-17438417190340=/root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340 <<< 30564 1726882806.96040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882806.96115: stderr chunk (state=3): >>><<< 30564 1726882806.96125: stdout chunk (state=3): >>><<< 30564 1726882806.96274: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882806.9338794-30886-17438417190340=/root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882806.96279: variable 'ansible_module_compression' from source: unknown 30564 1726882806.96281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882806.96283: variable 'ansible_facts' from source: unknown 30564 1726882806.96362: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340/AnsiballZ_command.py 30564 1726882806.96521: Sending initial data 30564 1726882806.96525: Sent initial data (155 bytes) 30564 1726882806.97447: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882806.97451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882806.97494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.97497: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882806.97500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882806.97502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882806.97549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882806.97568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882806.97661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882806.99400: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882806.99500: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882806.99603: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp2k12xwwh 
/root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340/AnsiballZ_command.py <<< 30564 1726882806.99698: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882807.00754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.00847: stderr chunk (state=3): >>><<< 30564 1726882807.00851: stdout chunk (state=3): >>><<< 30564 1726882807.00869: done transferring module to remote 30564 1726882807.00880: _low_level_execute_command(): starting 30564 1726882807.00883: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340/ /root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340/AnsiballZ_command.py && sleep 0' 30564 1726882807.01306: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.01309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.01339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.01343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.01345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 
1726882807.01397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.01404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.01504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.03238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.03292: stderr chunk (state=3): >>><<< 30564 1726882807.03295: stdout chunk (state=3): >>><<< 30564 1726882807.03309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.03312: _low_level_execute_command(): starting 30564 1726882807.03317: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340/AnsiballZ_command.py && sleep 0' 30564 1726882807.03792: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.03797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.03800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.03804: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882807.03806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.03832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882807.03835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.03837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.03891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.03894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.04004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.17869: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": 
"2024-09-20 21:40:07.169077", "end": "2024-09-20 21:40:07.176653", "delta": "0:00:00.007576", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882807.18976: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. <<< 30564 1726882807.19036: stderr chunk (state=3): >>><<< 30564 1726882807.19040: stdout chunk (state=3): >>><<< 30564 1726882807.19056: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:40:07.169077", "end": "2024-09-20 21:40:07.176653", "delta": "0:00:00.007576", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 30564 1726882807.19089: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882807.19095: _low_level_execute_command(): starting 30564 1726882807.19102: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882806.9338794-30886-17438417190340/ > /dev/null 2>&1 && sleep 0' 30564 1726882807.19559: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.19577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.19594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found <<< 30564 1726882807.19607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882807.19621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.19662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.19681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.19786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.21602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.21644: stderr chunk (state=3): >>><<< 30564 1726882807.21648: stdout chunk (state=3): >>><<< 30564 1726882807.21660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.21671: handler run complete 30564 1726882807.21689: Evaluated conditional (False): False 30564 1726882807.21700: attempt loop complete, returning result 30564 1726882807.21703: _execute() done 30564 1726882807.21705: dumping result to json 30564 1726882807.21710: done dumping result, returning 30564 1726882807.21716: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0e448fcc-3ce9-4216-acec-00000000011b] 30564 1726882807.21722: sending task result for task 0e448fcc-3ce9-4216-acec-00000000011b 30564 1726882807.21813: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000011b 30564 1726882807.21816: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.007576", "end": "2024-09-20 21:40:07.176653", "rc": 1, "start": "2024-09-20 21:40:07.169077" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30564 1726882807.21880: no more pending results, returning what we have 30564 1726882807.21884: results queue empty 30564 1726882807.21885: checking for any_errors_fatal 30564 1726882807.21887: done checking for any_errors_fatal 30564 1726882807.21887: checking for max_fail_percentage 30564 1726882807.21889: done checking for max_fail_percentage 30564 1726882807.21890: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.21891: done checking to see if all hosts have failed 30564 1726882807.21892: getting the remaining hosts for this loop 30564 1726882807.21893: done getting the remaining hosts for this loop 30564 1726882807.21897: getting the next task for host managed_node2 30564 1726882807.21906: done getting next task for host managed_node2 30564 1726882807.21909: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882807.21912: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30564 1726882807.21915: getting variables 30564 1726882807.21917: in VariableManager get_vars() 30564 1726882807.21944: Calling all_inventory to load vars for managed_node2 30564 1726882807.21946: Calling groups_inventory to load vars for managed_node2 30564 1726882807.21950: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.21961: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.21966: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.21969: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.22110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.22259: done with get_vars() 30564 1726882807.22268: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:40:07 -0400 (0:00:00.330) 0:00:05.804 ****** 30564 1726882807.22332: entering _queue_task() for managed_node2/include_tasks 30564 1726882807.22514: worker is 1 (out of 1 available) 30564 1726882807.22528: exiting _queue_task() for managed_node2/include_tasks 30564 1726882807.22540: done queuing things up, now waiting for results queue to drain 30564 1726882807.22541: waiting for pending results... 
30564 1726882807.22693: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882807.22754: in run() - task 0e448fcc-3ce9-4216-acec-00000000011f 30564 1726882807.22769: variable 'ansible_search_path' from source: unknown 30564 1726882807.22773: variable 'ansible_search_path' from source: unknown 30564 1726882807.22806: calling self._execute() 30564 1726882807.22862: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.22875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.22887: variable 'omit' from source: magic vars 30564 1726882807.23139: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.23149: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.23155: _execute() done 30564 1726882807.23158: dumping result to json 30564 1726882807.23160: done dumping result, returning 30564 1726882807.23166: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-00000000011f] 30564 1726882807.23175: sending task result for task 0e448fcc-3ce9-4216-acec-00000000011f 30564 1726882807.23259: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000011f 30564 1726882807.23261: WORKER PROCESS EXITING 30564 1726882807.23290: no more pending results, returning what we have 30564 1726882807.23294: in VariableManager get_vars() 30564 1726882807.23324: Calling all_inventory to load vars for managed_node2 30564 1726882807.23327: Calling groups_inventory to load vars for managed_node2 30564 1726882807.23330: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.23339: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.23342: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.23345: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882807.23462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.23585: done with get_vars() 30564 1726882807.23590: variable 'ansible_search_path' from source: unknown 30564 1726882807.23590: variable 'ansible_search_path' from source: unknown 30564 1726882807.23597: variable 'item' from source: include params 30564 1726882807.23677: variable 'item' from source: include params 30564 1726882807.23703: we have included files to process 30564 1726882807.23704: generating all_blocks data 30564 1726882807.23705: done generating all_blocks data 30564 1726882807.23708: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882807.23709: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882807.23710: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882807.23858: done processing included file 30564 1726882807.23860: iterating over new_blocks loaded from include file 30564 1726882807.23861: in VariableManager get_vars() 30564 1726882807.23873: done with get_vars() 30564 1726882807.23874: filtering new block on tags 30564 1726882807.23891: done filtering new block on tags 30564 1726882807.23892: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882807.23895: extending task lists for all hosts with included blocks 30564 1726882807.23989: done extending task lists 30564 1726882807.23990: done processing included files 30564 1726882807.23991: results queue empty 30564 1726882807.23991: checking for any_errors_fatal 30564 1726882807.23994: done 
checking for any_errors_fatal 30564 1726882807.23995: checking for max_fail_percentage 30564 1726882807.23996: done checking for max_fail_percentage 30564 1726882807.23996: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.23997: done checking to see if all hosts have failed 30564 1726882807.23997: getting the remaining hosts for this loop 30564 1726882807.23998: done getting the remaining hosts for this loop 30564 1726882807.23999: getting the next task for host managed_node2 30564 1726882807.24002: done getting next task for host managed_node2 30564 1726882807.24004: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882807.24006: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882807.24007: getting variables 30564 1726882807.24008: in VariableManager get_vars() 30564 1726882807.24013: Calling all_inventory to load vars for managed_node2 30564 1726882807.24014: Calling groups_inventory to load vars for managed_node2 30564 1726882807.24016: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.24019: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.24020: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.24022: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.24129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.24248: done with get_vars() 30564 1726882807.24254: done getting variables 30564 1726882807.24334: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:40:07 -0400 (0:00:00.020) 0:00:05.824 ****** 30564 1726882807.24353: entering _queue_task() for managed_node2/stat 30564 1726882807.24515: worker is 1 (out of 1 available) 30564 1726882807.24529: exiting _queue_task() for managed_node2/stat 30564 1726882807.24539: done queuing things up, now waiting for results queue to drain 30564 1726882807.24540: waiting for pending results... 
30564 1726882807.24684: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882807.24742: in run() - task 0e448fcc-3ce9-4216-acec-00000000016e 30564 1726882807.24752: variable 'ansible_search_path' from source: unknown 30564 1726882807.24755: variable 'ansible_search_path' from source: unknown 30564 1726882807.24785: calling self._execute() 30564 1726882807.24838: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.24841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.24850: variable 'omit' from source: magic vars 30564 1726882807.25084: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.25094: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.25099: variable 'omit' from source: magic vars 30564 1726882807.25133: variable 'omit' from source: magic vars 30564 1726882807.25199: variable 'interface' from source: play vars 30564 1726882807.25213: variable 'omit' from source: magic vars 30564 1726882807.25247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882807.25277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882807.25292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882807.25305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882807.25315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882807.25338: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882807.25341: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.25344: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.25413: Set connection var ansible_timeout to 10 30564 1726882807.25418: Set connection var ansible_pipelining to False 30564 1726882807.25420: Set connection var ansible_shell_type to sh 30564 1726882807.25425: Set connection var ansible_shell_executable to /bin/sh 30564 1726882807.25433: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882807.25436: Set connection var ansible_connection to ssh 30564 1726882807.25452: variable 'ansible_shell_executable' from source: unknown 30564 1726882807.25459: variable 'ansible_connection' from source: unknown 30564 1726882807.25462: variable 'ansible_module_compression' from source: unknown 30564 1726882807.25466: variable 'ansible_shell_type' from source: unknown 30564 1726882807.25471: variable 'ansible_shell_executable' from source: unknown 30564 1726882807.25473: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.25475: variable 'ansible_pipelining' from source: unknown 30564 1726882807.25478: variable 'ansible_timeout' from source: unknown 30564 1726882807.25481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.25621: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882807.25629: variable 'omit' from source: magic vars 30564 1726882807.25634: starting attempt loop 30564 1726882807.25636: running the handler 30564 1726882807.25648: _low_level_execute_command(): starting 30564 1726882807.25654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882807.26157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.26177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.26193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.26205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.26255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.26274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.26375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.27972: stdout chunk (state=3): >>>/root <<< 30564 1726882807.28073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.28119: stderr chunk (state=3): >>><<< 30564 1726882807.28122: stdout chunk (state=3): >>><<< 30564 1726882807.28140: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.28151: _low_level_execute_command(): starting 30564 1726882807.28156: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305 `" && echo ansible-tmp-1726882807.2813852-30899-100777030073305="` echo /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305 `" ) && sleep 0' 30564 1726882807.28578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.28597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.28614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.28624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.28673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.28686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.28795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.30654: stdout chunk (state=3): >>>ansible-tmp-1726882807.2813852-30899-100777030073305=/root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305 <<< 30564 1726882807.30762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.30804: stderr chunk (state=3): >>><<< 30564 1726882807.30807: stdout chunk (state=3): >>><<< 30564 1726882807.30819: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882807.2813852-30899-100777030073305=/root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.30852: variable 'ansible_module_compression' from source: unknown 30564 1726882807.30895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882807.30918: variable 'ansible_facts' from source: unknown 30564 1726882807.30981: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305/AnsiballZ_stat.py 30564 1726882807.31070: Sending initial data 30564 1726882807.31080: Sent initial data (153 bytes) 30564 1726882807.31884: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.31887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.31921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.31925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.31928: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.31986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.31993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882807.31995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.32092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.33820: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882807.33824: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882807.33916: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882807.34018: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpukyvvym2 /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305/AnsiballZ_stat.py <<< 30564 1726882807.34112: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882807.35469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.35573: stderr chunk (state=3): >>><<< 30564 1726882807.35576: stdout 
chunk (state=3): >>><<< 30564 1726882807.35578: done transferring module to remote 30564 1726882807.35580: _low_level_execute_command(): starting 30564 1726882807.35583: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305/ /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305/AnsiballZ_stat.py && sleep 0' 30564 1726882807.36091: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.36095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.36136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.36139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.36142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.36144: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.36190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.36193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.36297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882807.38044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.38111: stderr chunk (state=3): >>><<< 30564 1726882807.38114: stdout chunk (state=3): >>><<< 30564 1726882807.38172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.38182: _low_level_execute_command(): starting 30564 1726882807.38185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305/AnsiballZ_stat.py && sleep 0' 30564 1726882807.38738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882807.38751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.38766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
30564 1726882807.38787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.38825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.38840: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882807.38855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.38877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882807.38888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882807.38898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882807.38909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.38922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.38937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.38949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.38959: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882807.38977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.39052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.39081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882807.39097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.39232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.52320: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": 
{"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882807.53398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882807.53401: stdout chunk (state=3): >>><<< 30564 1726882807.53403: stderr chunk (state=3): >>><<< 30564 1726882807.53531: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
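The stat module result above reports that `/sys/class/net/statebr` does not exist, which is how these tests detect interface absence: Linux exposes one sysfs directory per network interface under `/sys/class/net`. A minimal sketch of the same check (the helpers `interface_exists` and `stat_result` are ours for illustration, not Ansible code):

```python
import os

def interface_exists(name, sysfs_root="/sys/class/net"):
    """A network interface is present iff its sysfs directory exists,
    which is exactly what the stat task in the log is probing."""
    return os.path.exists(os.path.join(sysfs_root, name))

def stat_result(path):
    """Shape the answer like the module's return value in the log."""
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

# In this run the bridge 'statebr' had not been created yet, so the
# check came back false.
print(stat_result("/sys/class/net/statebr"))
```

On the managed node in this run the task returned `{"changed": false, "stat": {"exists": false}}`, which is what the follow-up assert (`not interface_stat.stat.exists`) relies on.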
30564 1726882807.53536: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882807.53539: _low_level_execute_command(): starting 30564 1726882807.53541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882807.2813852-30899-100777030073305/ > /dev/null 2>&1 && sleep 0' 30564 1726882807.54120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882807.54137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.54155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.54184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.54238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.54254: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882807.54275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.54294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882807.54311: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882807.54323: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882807.54335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.54348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.54365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.54381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.54392: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882807.54405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.54484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.54506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882807.54523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.54660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.56487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.56570: stderr chunk (state=3): >>><<< 30564 1726882807.56581: stdout chunk (state=3): >>><<< 30564 1726882807.56976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.56984: handler run complete 30564 1726882807.56986: attempt loop complete, returning result 30564 1726882807.56988: _execute() done 30564 1726882807.56990: dumping result to json 30564 1726882807.56992: done dumping result, returning 30564 1726882807.56994: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-00000000016e] 30564 1726882807.56996: sending task result for task 0e448fcc-3ce9-4216-acec-00000000016e 30564 1726882807.57071: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000016e 30564 1726882807.57076: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882807.57130: no more pending results, returning what we have 30564 1726882807.57133: results queue empty 30564 1726882807.57134: checking for any_errors_fatal 30564 1726882807.57135: done checking for any_errors_fatal 30564 1726882807.57136: checking for max_fail_percentage 30564 1726882807.57138: done checking for max_fail_percentage 30564 1726882807.57138: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.57139: done checking to see if all hosts have failed 30564 1726882807.57140: getting the remaining hosts for this loop 30564 
1726882807.57141: done getting the remaining hosts for this loop 30564 1726882807.57145: getting the next task for host managed_node2 30564 1726882807.57152: done getting next task for host managed_node2 30564 1726882807.57154: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30564 1726882807.57158: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882807.57162: getting variables 30564 1726882807.57165: in VariableManager get_vars() 30564 1726882807.57195: Calling all_inventory to load vars for managed_node2 30564 1726882807.57197: Calling groups_inventory to load vars for managed_node2 30564 1726882807.57201: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.57211: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.57213: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.57216: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.57410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.57647: done with get_vars() 30564 1726882807.57658: done getting variables 30564 1726882807.57760: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 30564 1726882807.57892: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:40:07 -0400 (0:00:00.335) 0:00:06.160 ****** 30564 1726882807.57927: entering _queue_task() for managed_node2/assert 30564 1726882807.57929: Creating lock for assert 30564 1726882807.58182: worker is 1 (out of 1 available) 30564 1726882807.58194: exiting _queue_task() for managed_node2/assert 30564 1726882807.58205: done queuing things up, now waiting for results queue to drain 30564 1726882807.58206: waiting for pending results... 
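The remote working directories that recur throughout this log (e.g. `ansible-tmp-1726882807.2813852-30899-100777030073305`) appear to be named from a timestamp plus what look like a process id and a random suffix, and the shell wrapper in the log creates them under `umask 77` so only the connecting user can read them. A rough reconstruction of that naming; `make_tmp_name` and `mkdir_command` are our own reading of the log, not ansible-core source:

```python
import os
import random
import re
import time

def make_tmp_name(base="/root/.ansible/tmp"):
    """Sketch of the ansible-tmp-<timestamp>-<pid>-<random> scheme
    seen in the log (the exact components are an assumption)."""
    leaf = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))
    return os.path.join(base, leaf)

def mkdir_command(tmp_path):
    """The umask-guarded shell wrapper the log shows for creating
    the directory on the remote side."""
    return ("( umask 77 && mkdir -p \"` echo %s `\" && mkdir \"` echo %s `\" )"
            " && sleep 0" % (os.path.dirname(tmp_path), tmp_path))

name = make_tmp_name()
print(name)
print(mkdir_command(name))
```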
30564 1726882807.58491: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30564 1726882807.58616: in run() - task 0e448fcc-3ce9-4216-acec-000000000120 30564 1726882807.58635: variable 'ansible_search_path' from source: unknown 30564 1726882807.58644: variable 'ansible_search_path' from source: unknown 30564 1726882807.58695: calling self._execute() 30564 1726882807.58785: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.58797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.58812: variable 'omit' from source: magic vars 30564 1726882807.59275: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.59293: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.59310: variable 'omit' from source: magic vars 30564 1726882807.59360: variable 'omit' from source: magic vars 30564 1726882807.59470: variable 'interface' from source: play vars 30564 1726882807.59494: variable 'omit' from source: magic vars 30564 1726882807.59544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882807.59588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882807.59612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882807.59639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882807.59660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882807.59700: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882807.59709: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.59716: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.59829: Set connection var ansible_timeout to 10 30564 1726882807.59840: Set connection var ansible_pipelining to False 30564 1726882807.59852: Set connection var ansible_shell_type to sh 30564 1726882807.59871: Set connection var ansible_shell_executable to /bin/sh 30564 1726882807.59886: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882807.59894: Set connection var ansible_connection to ssh 30564 1726882807.59922: variable 'ansible_shell_executable' from source: unknown 30564 1726882807.59931: variable 'ansible_connection' from source: unknown 30564 1726882807.59939: variable 'ansible_module_compression' from source: unknown 30564 1726882807.59946: variable 'ansible_shell_type' from source: unknown 30564 1726882807.59959: variable 'ansible_shell_executable' from source: unknown 30564 1726882807.59975: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.59984: variable 'ansible_pipelining' from source: unknown 30564 1726882807.59991: variable 'ansible_timeout' from source: unknown 30564 1726882807.59998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.60146: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882807.60162: variable 'omit' from source: magic vars 30564 1726882807.60180: starting attempt loop 30564 1726882807.60187: running the handler 30564 1726882807.60337: variable 'interface_stat' from source: set_fact 30564 1726882807.60351: Evaluated conditional (not interface_stat.stat.exists): True 30564 1726882807.60361: handler run complete 30564 1726882807.60386: attempt loop complete, returning result 
30564 1726882807.60396: _execute() done 30564 1726882807.60403: dumping result to json 30564 1726882807.60415: done dumping result, returning 30564 1726882807.60426: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0e448fcc-3ce9-4216-acec-000000000120] 30564 1726882807.60436: sending task result for task 0e448fcc-3ce9-4216-acec-000000000120 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882807.60579: no more pending results, returning what we have 30564 1726882807.60583: results queue empty 30564 1726882807.60584: checking for any_errors_fatal 30564 1726882807.60593: done checking for any_errors_fatal 30564 1726882807.60593: checking for max_fail_percentage 30564 1726882807.60595: done checking for max_fail_percentage 30564 1726882807.60596: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.60597: done checking to see if all hosts have failed 30564 1726882807.60598: getting the remaining hosts for this loop 30564 1726882807.60600: done getting the remaining hosts for this loop 30564 1726882807.60603: getting the next task for host managed_node2 30564 1726882807.60612: done getting next task for host managed_node2 30564 1726882807.60615: ^ task is: TASK: Test 30564 1726882807.60618: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882807.60622: getting variables 30564 1726882807.60624: in VariableManager get_vars() 30564 1726882807.60651: Calling all_inventory to load vars for managed_node2 30564 1726882807.60654: Calling groups_inventory to load vars for managed_node2 30564 1726882807.60658: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.60674: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.60679: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.60683: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.60911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.61140: done with get_vars() 30564 1726882807.61150: done getting variables 30564 1726882807.61305: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000120 30564 1726882807.61308: WORKER PROCESS EXITING TASK [Test] ******************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:40:07 -0400 (0:00:00.034) 0:00:06.195 ****** 30564 1726882807.61374: entering _queue_task() for managed_node2/include_tasks 30564 1726882807.61692: worker is 1 (out of 1 available) 30564 1726882807.61704: exiting _queue_task() for managed_node2/include_tasks 30564 1726882807.61716: done queuing things up, now waiting for results queue to drain 30564 1726882807.61717: waiting for pending results... 
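The assert action above evaluates `not interface_stat.stat.exists` against the fact registered by the earlier stat task and reports "All assertions passed". A stripped-down sketch of that evaluation; `run_assert` is our own illustrative helper, not the real `assert` action plugin, and it takes plain Python expressions where Ansible would template Jinja2 conditionals:

```python
def run_assert(conditions, facts, fail_msg="Assertion failed",
               success_msg="All assertions passed"):
    """Evaluate each condition against the facts dict; any false
    condition fails the task, otherwise report success."""
    for cond in conditions:
        # Illustrative only: never eval untrusted input in real code.
        if not eval(cond, {}, facts):
            return {"changed": False, "failed": True, "msg": fail_msg}
    return {"changed": False, "msg": success_msg}

# Mimic 'not interface_stat.stat.exists' with dict lookups.
interface_stat = {"stat": {"exists": False}}
result = run_assert(["not interface_stat['stat']['exists']"],
                    {"interface_stat": interface_stat})
print(result["msg"])  # → All assertions passed
```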
30564 1726882807.61975: running TaskExecutor() for managed_node2/TASK: Test 30564 1726882807.62077: in run() - task 0e448fcc-3ce9-4216-acec-000000000095 30564 1726882807.62093: variable 'ansible_search_path' from source: unknown 30564 1726882807.62099: variable 'ansible_search_path' from source: unknown 30564 1726882807.62140: variable 'lsr_test' from source: include params 30564 1726882807.62327: variable 'lsr_test' from source: include params 30564 1726882807.62399: variable 'omit' from source: magic vars 30564 1726882807.62515: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.62529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.62543: variable 'omit' from source: magic vars 30564 1726882807.62782: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.62796: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.62806: variable 'item' from source: unknown 30564 1726882807.62882: variable 'item' from source: unknown 30564 1726882807.62915: variable 'item' from source: unknown 30564 1726882807.62989: variable 'item' from source: unknown 30564 1726882807.63120: dumping result to json 30564 1726882807.63128: done dumping result, returning 30564 1726882807.63136: done running TaskExecutor() for managed_node2/TASK: Test [0e448fcc-3ce9-4216-acec-000000000095] 30564 1726882807.63145: sending task result for task 0e448fcc-3ce9-4216-acec-000000000095 30564 1726882807.63211: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000095 30564 1726882807.63234: no more pending results, returning what we have 30564 1726882807.63238: in VariableManager get_vars() 30564 1726882807.63269: Calling all_inventory to load vars for managed_node2 30564 1726882807.63272: Calling groups_inventory to load vars for managed_node2 30564 1726882807.63276: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.63288: 
Calling all_plugins_play to load vars for managed_node2 30564 1726882807.63291: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.63294: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.63492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.63716: done with get_vars() 30564 1726882807.63723: variable 'ansible_search_path' from source: unknown 30564 1726882807.63724: variable 'ansible_search_path' from source: unknown 30564 1726882807.63763: we have included files to process 30564 1726882807.63766: generating all_blocks data 30564 1726882807.63771: done generating all_blocks data 30564 1726882807.63779: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882807.63781: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882807.63784: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882807.64233: WORKER PROCESS EXITING 30564 1726882807.64323: done processing included file 30564 1726882807.64325: iterating over new_blocks loaded from include file 30564 1726882807.64327: in VariableManager get_vars() 30564 1726882807.64339: done with get_vars() 30564 1726882807.64341: filtering new block on tags 30564 1726882807.64376: done filtering new block on tags 30564 1726882807.64379: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30564 1726882807.64383: extending task lists for all hosts with included blocks 30564 1726882807.65288: done 
extending task lists 30564 1726882807.65289: done processing included files 30564 1726882807.65290: results queue empty 30564 1726882807.65291: checking for any_errors_fatal 30564 1726882807.65293: done checking for any_errors_fatal 30564 1726882807.65294: checking for max_fail_percentage 30564 1726882807.65295: done checking for max_fail_percentage 30564 1726882807.65296: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.65297: done checking to see if all hosts have failed 30564 1726882807.65298: getting the remaining hosts for this loop 30564 1726882807.65299: done getting the remaining hosts for this loop 30564 1726882807.65306: getting the next task for host managed_node2 30564 1726882807.65310: done getting next task for host managed_node2 30564 1726882807.65312: ^ task is: TASK: Include network role 30564 1726882807.65315: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882807.65317: getting variables 30564 1726882807.65318: in VariableManager get_vars() 30564 1726882807.65326: Calling all_inventory to load vars for managed_node2 30564 1726882807.65328: Calling groups_inventory to load vars for managed_node2 30564 1726882807.65330: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.65335: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.65337: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.65340: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.65515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.65740: done with get_vars() 30564 1726882807.65749: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:40:07 -0400 (0:00:00.044) 0:00:06.239 ****** 30564 1726882807.65822: entering _queue_task() for managed_node2/include_role 30564 1726882807.65824: Creating lock for include_role 30564 1726882807.66052: worker is 1 (out of 1 available) 30564 1726882807.66071: exiting _queue_task() for managed_node2/include_role 30564 1726882807.66084: done queuing things up, now waiting for results queue to drain 30564 1726882807.66086: waiting for pending results... 
30564 1726882807.66338: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882807.66459: in run() - task 0e448fcc-3ce9-4216-acec-00000000018e 30564 1726882807.66482: variable 'ansible_search_path' from source: unknown 30564 1726882807.66490: variable 'ansible_search_path' from source: unknown 30564 1726882807.66534: calling self._execute() 30564 1726882807.66608: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.66623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.66642: variable 'omit' from source: magic vars 30564 1726882807.66994: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.67012: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.67026: _execute() done 30564 1726882807.67035: dumping result to json 30564 1726882807.67044: done dumping result, returning 30564 1726882807.67057: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-00000000018e] 30564 1726882807.67077: sending task result for task 0e448fcc-3ce9-4216-acec-00000000018e 30564 1726882807.67219: no more pending results, returning what we have 30564 1726882807.67224: in VariableManager get_vars() 30564 1726882807.67253: Calling all_inventory to load vars for managed_node2 30564 1726882807.67256: Calling groups_inventory to load vars for managed_node2 30564 1726882807.67260: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.67277: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.67281: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.67285: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.67494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.67726: done with get_vars() 30564 1726882807.67734: 
variable 'ansible_search_path' from source: unknown 30564 1726882807.67735: variable 'ansible_search_path' from source: unknown 30564 1726882807.68048: variable 'omit' from source: magic vars 30564 1726882807.68091: variable 'omit' from source: magic vars 30564 1726882807.68105: variable 'omit' from source: magic vars 30564 1726882807.68109: we have included files to process 30564 1726882807.68110: generating all_blocks data 30564 1726882807.68112: done generating all_blocks data 30564 1726882807.68113: processing included file: fedora.linux_system_roles.network 30564 1726882807.68190: in VariableManager get_vars() 30564 1726882807.68201: done with get_vars() 30564 1726882807.68228: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000018e 30564 1726882807.68233: WORKER PROCESS EXITING 30564 1726882807.68281: in VariableManager get_vars() 30564 1726882807.68296: done with get_vars() 30564 1726882807.68564: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882807.68779: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882807.68916: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882807.69628: in VariableManager get_vars() 30564 1726882807.69650: done with get_vars() 30564 1726882807.70087: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882807.71855: iterating over new_blocks loaded from include file 30564 1726882807.71857: in VariableManager get_vars() 30564 1726882807.71877: done with get_vars() 30564 1726882807.71879: filtering new block on tags 30564 1726882807.72181: done filtering new block on tags 30564 1726882807.72185: in VariableManager get_vars() 30564 1726882807.72199: done with get_vars() 30564 1726882807.72200: 
filtering new block on tags 30564 1726882807.72217: done filtering new block on tags 30564 1726882807.72219: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882807.72224: extending task lists for all hosts with included blocks 30564 1726882807.72406: done extending task lists 30564 1726882807.72408: done processing included files 30564 1726882807.72409: results queue empty 30564 1726882807.72409: checking for any_errors_fatal 30564 1726882807.72412: done checking for any_errors_fatal 30564 1726882807.72413: checking for max_fail_percentage 30564 1726882807.72414: done checking for max_fail_percentage 30564 1726882807.72415: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.72415: done checking to see if all hosts have failed 30564 1726882807.72416: getting the remaining hosts for this loop 30564 1726882807.72417: done getting the remaining hosts for this loop 30564 1726882807.72420: getting the next task for host managed_node2 30564 1726882807.72424: done getting next task for host managed_node2 30564 1726882807.72427: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882807.72430: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882807.72439: getting variables 30564 1726882807.72440: in VariableManager get_vars() 30564 1726882807.72451: Calling all_inventory to load vars for managed_node2 30564 1726882807.72453: Calling groups_inventory to load vars for managed_node2 30564 1726882807.72455: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.72459: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.72462: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.72470: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.72642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.72877: done with get_vars() 30564 1726882807.72885: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:40:07 -0400 (0:00:00.071) 0:00:06.310 ****** 30564 1726882807.72956: entering _queue_task() for managed_node2/include_tasks 30564 1726882807.73198: worker is 1 (out of 1 available) 30564 1726882807.73210: exiting _queue_task() for managed_node2/include_tasks 30564 1726882807.73220: done queuing things up, now waiting for results queue to drain 30564 1726882807.73221: waiting for pending results... 
30564 1726882807.73481: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882807.73610: in run() - task 0e448fcc-3ce9-4216-acec-00000000020c 30564 1726882807.73629: variable 'ansible_search_path' from source: unknown 30564 1726882807.73637: variable 'ansible_search_path' from source: unknown 30564 1726882807.73685: calling self._execute() 30564 1726882807.73769: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.73789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.73804: variable 'omit' from source: magic vars 30564 1726882807.74172: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.74192: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.74203: _execute() done 30564 1726882807.74216: dumping result to json 30564 1726882807.74227: done dumping result, returning 30564 1726882807.74238: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-00000000020c] 30564 1726882807.74248: sending task result for task 0e448fcc-3ce9-4216-acec-00000000020c 30564 1726882807.74389: no more pending results, returning what we have 30564 1726882807.74394: in VariableManager get_vars() 30564 1726882807.74431: Calling all_inventory to load vars for managed_node2 30564 1726882807.74434: Calling groups_inventory to load vars for managed_node2 30564 1726882807.74437: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.74449: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.74453: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.74456: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.74661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 30564 1726882807.74910: done with get_vars() 30564 1726882807.74918: variable 'ansible_search_path' from source: unknown 30564 1726882807.74920: variable 'ansible_search_path' from source: unknown 30564 1726882807.74962: we have included files to process 30564 1726882807.74976: generating all_blocks data 30564 1726882807.74979: done generating all_blocks data 30564 1726882807.75097: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882807.75099: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882807.75105: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000020c 30564 1726882807.75107: WORKER PROCESS EXITING 30564 1726882807.75110: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882807.75890: done processing included file 30564 1726882807.75893: iterating over new_blocks loaded from include file 30564 1726882807.75894: in VariableManager get_vars() 30564 1726882807.75917: done with get_vars() 30564 1726882807.75918: filtering new block on tags 30564 1726882807.75948: done filtering new block on tags 30564 1726882807.75950: in VariableManager get_vars() 30564 1726882807.75981: done with get_vars() 30564 1726882807.75983: filtering new block on tags 30564 1726882807.76026: done filtering new block on tags 30564 1726882807.76029: in VariableManager get_vars() 30564 1726882807.76049: done with get_vars() 30564 1726882807.76051: filtering new block on tags 30564 1726882807.76102: done filtering new block on tags 30564 1726882807.76105: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882807.76109: extending task lists for all hosts 
with included blocks 30564 1726882807.77875: done extending task lists 30564 1726882807.77877: done processing included files 30564 1726882807.77878: results queue empty 30564 1726882807.77879: checking for any_errors_fatal 30564 1726882807.77882: done checking for any_errors_fatal 30564 1726882807.77882: checking for max_fail_percentage 30564 1726882807.77883: done checking for max_fail_percentage 30564 1726882807.77884: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.77885: done checking to see if all hosts have failed 30564 1726882807.77886: getting the remaining hosts for this loop 30564 1726882807.77887: done getting the remaining hosts for this loop 30564 1726882807.77889: getting the next task for host managed_node2 30564 1726882807.77894: done getting next task for host managed_node2 30564 1726882807.77896: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882807.77900: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882807.77908: getting variables 30564 1726882807.77909: in VariableManager get_vars() 30564 1726882807.77920: Calling all_inventory to load vars for managed_node2 30564 1726882807.77922: Calling groups_inventory to load vars for managed_node2 30564 1726882807.77929: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.77934: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.77936: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.77939: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.78095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.78336: done with get_vars() 30564 1726882807.78345: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:40:07 -0400 (0:00:00.054) 0:00:06.365 ****** 30564 1726882807.78419: entering _queue_task() for managed_node2/setup 30564 1726882807.78638: worker is 1 (out of 1 available) 30564 1726882807.78650: exiting _queue_task() for managed_node2/setup 30564 1726882807.78660: done queuing things up, now waiting for results queue to drain 30564 1726882807.78661: waiting for pending results... 
30564 1726882807.78927: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882807.79075: in run() - task 0e448fcc-3ce9-4216-acec-000000000269 30564 1726882807.79094: variable 'ansible_search_path' from source: unknown 30564 1726882807.79105: variable 'ansible_search_path' from source: unknown 30564 1726882807.79144: calling self._execute() 30564 1726882807.79226: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.79239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.79252: variable 'omit' from source: magic vars 30564 1726882807.79603: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.79619: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.79883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882807.82162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882807.82234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882807.82285: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882807.82334: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882807.82376: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882807.82453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882807.82497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882807.82528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882807.82584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882807.82605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882807.82657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882807.82700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882807.82730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882807.82780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882807.82809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882807.82973: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882807.82987: variable 'ansible_facts' from source: unknown 30564 1726882807.83091: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882807.83099: when evaluation is False, skipping this task 30564 1726882807.83106: _execute() done 30564 1726882807.83117: dumping result to json 30564 1726882807.83130: done dumping result, returning 30564 1726882807.83141: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000000269] 30564 1726882807.83151: sending task result for task 0e448fcc-3ce9-4216-acec-000000000269 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882807.83294: no more pending results, returning what we have 30564 1726882807.83298: results queue empty 30564 1726882807.83299: checking for any_errors_fatal 30564 1726882807.83301: done checking for any_errors_fatal 30564 1726882807.83302: checking for max_fail_percentage 30564 1726882807.83303: done checking for max_fail_percentage 30564 1726882807.83304: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.83305: done checking to see if all hosts have failed 30564 1726882807.83306: getting the remaining hosts for this loop 30564 1726882807.83308: done getting the remaining hosts for this loop 30564 1726882807.83312: getting the next task for host managed_node2 30564 1726882807.83323: done getting next task for host managed_node2 30564 1726882807.83328: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882807.83334: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882807.83349: getting variables 30564 1726882807.83351: in VariableManager get_vars() 30564 1726882807.83389: Calling all_inventory to load vars for managed_node2 30564 1726882807.83392: Calling groups_inventory to load vars for managed_node2 30564 1726882807.83395: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.83404: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.83407: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.83410: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.83624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.83879: done with get_vars() 30564 1726882807.84002: done getting variables 30564 1726882807.84032: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000269 30564 1726882807.84035: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:40:07 -0400 (0:00:00.058) 0:00:06.423 ****** 30564 1726882807.84224: entering _queue_task() for managed_node2/stat 30564 1726882807.84461: worker is 1 (out of 1 available) 30564 1726882807.84476: exiting _queue_task() for managed_node2/stat 30564 1726882807.84486: done queuing things up, now waiting for results queue to drain 30564 1726882807.84487: waiting for pending results... 
30564 1726882807.84748: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882807.84897: in run() - task 0e448fcc-3ce9-4216-acec-00000000026b 30564 1726882807.84913: variable 'ansible_search_path' from source: unknown 30564 1726882807.84919: variable 'ansible_search_path' from source: unknown 30564 1726882807.84955: calling self._execute() 30564 1726882807.85040: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.85052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.85071: variable 'omit' from source: magic vars 30564 1726882807.85438: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.85457: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.85640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882807.85927: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882807.85983: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882807.86024: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882807.86069: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882807.86156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882807.86194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882807.86225: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882807.86259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882807.86353: variable '__network_is_ostree' from source: set_fact 30564 1726882807.86370: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882807.86383: when evaluation is False, skipping this task 30564 1726882807.86390: _execute() done 30564 1726882807.86397: dumping result to json 30564 1726882807.86404: done dumping result, returning 30564 1726882807.86415: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-00000000026b] 30564 1726882807.86426: sending task result for task 0e448fcc-3ce9-4216-acec-00000000026b 30564 1726882807.86536: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000026b 30564 1726882807.86544: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882807.86606: no more pending results, returning what we have 30564 1726882807.86610: results queue empty 30564 1726882807.86611: checking for any_errors_fatal 30564 1726882807.86619: done checking for any_errors_fatal 30564 1726882807.86619: checking for max_fail_percentage 30564 1726882807.86622: done checking for max_fail_percentage 30564 1726882807.86623: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.86624: done checking to see if all hosts have failed 30564 1726882807.86625: getting the remaining hosts for this loop 30564 1726882807.86626: done getting the remaining hosts for this loop 30564 
1726882807.86630: getting the next task for host managed_node2 30564 1726882807.86640: done getting next task for host managed_node2 30564 1726882807.86644: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882807.86650: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882807.86665: getting variables 30564 1726882807.86670: in VariableManager get_vars() 30564 1726882807.86705: Calling all_inventory to load vars for managed_node2 30564 1726882807.86707: Calling groups_inventory to load vars for managed_node2 30564 1726882807.86710: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.86720: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.86723: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.86726: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.86919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.87162: done with get_vars() 30564 1726882807.87178: done getting variables 30564 1726882807.87350: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:40:07 -0400 (0:00:00.031) 0:00:06.455 ****** 30564 1726882807.87395: entering _queue_task() for managed_node2/set_fact 30564 1726882807.87794: worker is 1 (out of 1 available) 30564 1726882807.87804: exiting _queue_task() for managed_node2/set_fact 30564 1726882807.87817: done queuing things up, now waiting for results queue to drain 30564 1726882807.87818: waiting for pending results... 
30564 1726882807.88102: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882807.88246: in run() - task 0e448fcc-3ce9-4216-acec-00000000026c 30564 1726882807.88281: variable 'ansible_search_path' from source: unknown 30564 1726882807.88290: variable 'ansible_search_path' from source: unknown 30564 1726882807.88328: calling self._execute() 30564 1726882807.88418: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.88430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.88443: variable 'omit' from source: magic vars 30564 1726882807.88847: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.88865: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.89040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882807.89314: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882807.89371: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882807.89407: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882807.89445: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882807.89538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882807.89578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882807.89610: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882807.89639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882807.89731: variable '__network_is_ostree' from source: set_fact 30564 1726882807.89743: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882807.89751: when evaluation is False, skipping this task 30564 1726882807.89757: _execute() done 30564 1726882807.89766: dumping result to json 30564 1726882807.89777: done dumping result, returning 30564 1726882807.89793: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-00000000026c] 30564 1726882807.89803: sending task result for task 0e448fcc-3ce9-4216-acec-00000000026c skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882807.89935: no more pending results, returning what we have 30564 1726882807.89939: results queue empty 30564 1726882807.89940: checking for any_errors_fatal 30564 1726882807.89945: done checking for any_errors_fatal 30564 1726882807.89946: checking for max_fail_percentage 30564 1726882807.89948: done checking for max_fail_percentage 30564 1726882807.89949: checking to see if all hosts have failed and the running result is not ok 30564 1726882807.89950: done checking to see if all hosts have failed 30564 1726882807.89951: getting the remaining hosts for this loop 30564 1726882807.89953: done getting the remaining hosts for this loop 30564 1726882807.89956: getting the next task for host managed_node2 30564 1726882807.89970: done getting next task for host managed_node2 30564 
1726882807.89975: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882807.89980: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882807.89993: getting variables 30564 1726882807.89995: in VariableManager get_vars() 30564 1726882807.90026: Calling all_inventory to load vars for managed_node2 30564 1726882807.90029: Calling groups_inventory to load vars for managed_node2 30564 1726882807.90031: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882807.90040: Calling all_plugins_play to load vars for managed_node2 30564 1726882807.90043: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882807.90046: Calling groups_plugins_play to load vars for managed_node2 30564 1726882807.90280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882807.90523: done with get_vars() 30564 1726882807.90532: done getting variables 30564 1726882807.90713: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000026c 30564 1726882807.90716: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:40:07 -0400 (0:00:00.033) 0:00:06.488 ****** 30564 1726882807.90754: entering _queue_task() for managed_node2/service_facts 30564 1726882807.90755: Creating lock for service_facts 30564 1726882807.91100: worker is 1 (out of 1 available) 30564 1726882807.91112: exiting _queue_task() for managed_node2/service_facts 30564 1726882807.91123: done queuing things up, now waiting for results queue to drain 30564 1726882807.91125: waiting for pending results... 
30564 1726882807.91388: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882807.91537: in run() - task 0e448fcc-3ce9-4216-acec-00000000026e 30564 1726882807.91558: variable 'ansible_search_path' from source: unknown 30564 1726882807.91576: variable 'ansible_search_path' from source: unknown 30564 1726882807.91617: calling self._execute() 30564 1726882807.91706: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.91718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.91731: variable 'omit' from source: magic vars 30564 1726882807.92078: variable 'ansible_distribution_major_version' from source: facts 30564 1726882807.92097: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882807.92112: variable 'omit' from source: magic vars 30564 1726882807.92202: variable 'omit' from source: magic vars 30564 1726882807.92248: variable 'omit' from source: magic vars 30564 1726882807.92297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882807.92347: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882807.92377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882807.92400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882807.92417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882807.92457: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882807.92471: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.92480: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882807.92596: Set connection var ansible_timeout to 10 30564 1726882807.92607: Set connection var ansible_pipelining to False 30564 1726882807.92614: Set connection var ansible_shell_type to sh 30564 1726882807.92624: Set connection var ansible_shell_executable to /bin/sh 30564 1726882807.92635: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882807.92641: Set connection var ansible_connection to ssh 30564 1726882807.92680: variable 'ansible_shell_executable' from source: unknown 30564 1726882807.92688: variable 'ansible_connection' from source: unknown 30564 1726882807.92696: variable 'ansible_module_compression' from source: unknown 30564 1726882807.92703: variable 'ansible_shell_type' from source: unknown 30564 1726882807.92709: variable 'ansible_shell_executable' from source: unknown 30564 1726882807.92715: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882807.92722: variable 'ansible_pipelining' from source: unknown 30564 1726882807.92729: variable 'ansible_timeout' from source: unknown 30564 1726882807.92736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882807.92952: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882807.92975: variable 'omit' from source: magic vars 30564 1726882807.92986: starting attempt loop 30564 1726882807.92997: running the handler 30564 1726882807.93015: _low_level_execute_command(): starting 30564 1726882807.93027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882807.93829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882807.93848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882807.93873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.93894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.93938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.93953: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882807.93975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.93998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882807.94010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882807.94023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882807.94036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.94051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882807.94076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.94094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882807.94108: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882807.94124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.94210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.94234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882807.94251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.94397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882807.96080: stdout chunk (state=3): >>>/root <<< 30564 1726882807.96190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.96288: stderr chunk (state=3): >>><<< 30564 1726882807.96302: stdout chunk (state=3): >>><<< 30564 1726882807.96439: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.96442: _low_level_execute_command(): starting 30564 1726882807.96445: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683 `" && echo ansible-tmp-1726882807.9635723-30927-211515638846683="` echo /root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683 `" ) && sleep 0' 30564 1726882807.97146: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882807.97150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882807.97195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.97199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882807.97201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882807.97260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882807.97268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882807.97272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882807.97376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882807.99278: stdout chunk (state=3): >>>ansible-tmp-1726882807.9635723-30927-211515638846683=/root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683 <<< 30564 1726882807.99391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882807.99450: stderr chunk (state=3): >>><<< 30564 1726882807.99453: stdout chunk (state=3): >>><<< 30564 1726882807.99773: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882807.9635723-30927-211515638846683=/root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882807.99777: variable 'ansible_module_compression' from source: unknown 30564 1726882807.99779: ANSIBALLZ: Using lock for service_facts 30564 1726882807.99781: ANSIBALLZ: Acquiring lock 30564 1726882807.99783: ANSIBALLZ: Lock acquired: 140506261315008 30564 1726882807.99785: ANSIBALLZ: Creating module 30564 1726882808.14160: ANSIBALLZ: Writing module into payload 30564 1726882808.14290: ANSIBALLZ: Writing module 30564 1726882808.14317: ANSIBALLZ: Renaming module 30564 1726882808.14332: ANSIBALLZ: Done creating module 30564 1726882808.14354: variable 'ansible_facts' from source: unknown 30564 1726882808.14437: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683/AnsiballZ_service_facts.py 30564 1726882808.14594: Sending initial data 30564 1726882808.14597: Sent initial data (162 bytes) 30564 1726882808.15604: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882808.15618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882808.15634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882808.15656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882808.15704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882808.15717: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882808.15732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882808.15750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882808.15767: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882808.15781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882808.15798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882808.15813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882808.15829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882808.15842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882808.15854: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882808.15870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 
1726882808.15951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882808.15970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882808.15987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882808.16137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882808.17967: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882808.18059: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882808.18159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpq4_nxpv5 /root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683/AnsiballZ_service_facts.py <<< 30564 1726882808.18254: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882808.19635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882808.19773: stderr chunk (state=3): >>><<< 30564 1726882808.19777: stdout chunk (state=3): >>><<< 30564 1726882808.19793: done transferring module to remote 30564 1726882808.19807: _low_level_execute_command(): starting 30564 1726882808.19810: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683/ /root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683/AnsiballZ_service_facts.py && sleep 0' 30564 1726882808.20456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882808.20467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882808.20481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882808.20493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882808.20533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882808.20539: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882808.20549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882808.20562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882808.20574: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882808.20581: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882808.20589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882808.20598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882808.20610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882808.20619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882808.20625: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882808.20642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882808.20719: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882808.20735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882808.20746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882808.20880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882808.22717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882808.22720: stdout chunk (state=3): >>><<< 30564 1726882808.22722: stderr chunk (state=3): >>><<< 30564 1726882808.22803: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882808.22806: _low_level_execute_command(): starting 30564 1726882808.22809: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683/AnsiballZ_service_facts.py && sleep 0' 30564 1726882808.23332: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882808.23348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882808.23365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882808.23385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882808.23427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882808.23439: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882808.23453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882808.23475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882808.23489: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882808.23500: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882808.23512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882808.23526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882808.23541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882808.23554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882808.23569: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882808.23583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882808.23647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
30564 1726882808.23668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882808.23684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882808.23819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882809.58001: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": 
"irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": <<< 30564 1726882809.58056: stdout chunk (state=3): >>>"systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": 
{"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "a<<< 30564 1726882809.58065: stdout chunk (state=3): >>>lias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882809.59396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882809.59399: stdout chunk (state=3): >>><<< 30564 1726882809.59401: stderr chunk (state=3): >>><<< 30564 1726882809.59679: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": 
{"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": 
"systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882809.60086: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882809.60101: _low_level_execute_command(): starting 30564 1726882809.60115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882807.9635723-30927-211515638846683/ > /dev/null 2>&1 && sleep 0' 30564 1726882809.61154: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882809.61158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882809.61198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882809.61201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882809.61204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882809.61271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882809.61280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882809.61292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882809.61434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882809.63285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882809.63339: stderr chunk (state=3): >>><<< 30564 1726882809.63343: stdout chunk (state=3): >>><<< 30564 1726882809.63604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 30564 1726882809.63608: handler run complete 30564 1726882809.63610: variable 'ansible_facts' from source: unknown 30564 1726882809.63743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882809.64250: variable 'ansible_facts' from source: unknown 30564 1726882809.64399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882809.64606: attempt loop complete, returning result 30564 1726882809.64616: _execute() done 30564 1726882809.64622: dumping result to json 30564 1726882809.64691: done dumping result, returning 30564 1726882809.64710: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-00000000026e] 30564 1726882809.64721: sending task result for task 0e448fcc-3ce9-4216-acec-00000000026e ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882809.65645: no more pending results, returning what we have 30564 1726882809.65648: results queue empty 30564 1726882809.65649: checking for any_errors_fatal 30564 1726882809.65652: done checking for any_errors_fatal 30564 1726882809.65652: checking for max_fail_percentage 30564 1726882809.65654: done checking for max_fail_percentage 30564 1726882809.65655: checking to see if all hosts have failed and the running result is not ok 30564 1726882809.65655: done checking to see if all hosts have failed 30564 1726882809.65656: getting the remaining hosts for this loop 30564 1726882809.65658: done getting the remaining hosts for this loop 30564 1726882809.65661: getting the next task for host managed_node2 30564 1726882809.65672: done getting next task for host managed_node2 30564 1726882809.65675: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 
30564 1726882809.65681: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882809.65689: getting variables 30564 1726882809.65691: in VariableManager get_vars() 30564 1726882809.65717: Calling all_inventory to load vars for managed_node2 30564 1726882809.65720: Calling groups_inventory to load vars for managed_node2 30564 1726882809.65722: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882809.65731: Calling all_plugins_play to load vars for managed_node2 30564 1726882809.65734: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882809.65737: Calling groups_plugins_play to load vars for managed_node2 30564 1726882809.66120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882809.66924: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000026e 30564 1726882809.66927: WORKER PROCESS EXITING 30564 1726882809.67088: done with get_vars() 30564 1726882809.67107: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:40:09 -0400 (0:00:01.764) 0:00:08.253 ****** 30564 1726882809.67228: entering _queue_task() for managed_node2/package_facts 30564 1726882809.67230: Creating lock for package_facts 30564 1726882809.67518: worker is 1 (out of 1 available) 30564 1726882809.67529: exiting _queue_task() for managed_node2/package_facts 30564 1726882809.67541: done queuing things up, now waiting for results queue to drain 30564 1726882809.67543: waiting for pending results... 
30564 1726882809.67845: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882809.68049: in run() - task 0e448fcc-3ce9-4216-acec-00000000026f 30564 1726882809.68074: variable 'ansible_search_path' from source: unknown 30564 1726882809.68083: variable 'ansible_search_path' from source: unknown 30564 1726882809.68124: calling self._execute() 30564 1726882809.68216: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882809.68229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882809.68243: variable 'omit' from source: magic vars 30564 1726882809.68759: variable 'ansible_distribution_major_version' from source: facts 30564 1726882809.68786: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882809.68797: variable 'omit' from source: magic vars 30564 1726882809.68886: variable 'omit' from source: magic vars 30564 1726882809.68920: variable 'omit' from source: magic vars 30564 1726882809.68961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882809.69011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882809.69033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882809.69057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882809.69080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882809.69119: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882809.69127: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882809.69134: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882809.69246: Set connection var ansible_timeout to 10 30564 1726882809.69256: Set connection var ansible_pipelining to False 30564 1726882809.69262: Set connection var ansible_shell_type to sh 30564 1726882809.69277: Set connection var ansible_shell_executable to /bin/sh 30564 1726882809.69288: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882809.69293: Set connection var ansible_connection to ssh 30564 1726882809.69324: variable 'ansible_shell_executable' from source: unknown 30564 1726882809.69331: variable 'ansible_connection' from source: unknown 30564 1726882809.69337: variable 'ansible_module_compression' from source: unknown 30564 1726882809.69342: variable 'ansible_shell_type' from source: unknown 30564 1726882809.69347: variable 'ansible_shell_executable' from source: unknown 30564 1726882809.69352: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882809.69358: variable 'ansible_pipelining' from source: unknown 30564 1726882809.69365: variable 'ansible_timeout' from source: unknown 30564 1726882809.69374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882809.69573: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882809.69589: variable 'omit' from source: magic vars 30564 1726882809.69598: starting attempt loop 30564 1726882809.69604: running the handler 30564 1726882809.69619: _low_level_execute_command(): starting 30564 1726882809.69635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882809.70413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882809.70429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882809.70443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882809.70460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882809.70514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882809.70530: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882809.70543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882809.70561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882809.70578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882809.70589: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882809.70600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882809.70618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882809.70637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882809.70647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882809.70658: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882809.70679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882809.70766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882809.70793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882809.70809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882809.70957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882809.72616: stdout chunk (state=3): >>>/root <<< 30564 1726882809.72721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882809.72805: stderr chunk (state=3): >>><<< 30564 1726882809.72823: stdout chunk (state=3): >>><<< 30564 1726882809.72873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882809.72876: _low_level_execute_command(): starting 30564 1726882809.72969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652 `" && echo ansible-tmp-1726882809.728499-30989-406478606652="` echo /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652 `" ) && sleep 0' 30564 1726882809.73605: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 30564 1726882809.73627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882809.73645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882809.73670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882809.73717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882809.73737: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882809.73755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882809.73781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882809.73794: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882809.73805: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882809.73818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882809.73831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882809.73851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882809.73862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882809.73879: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882809.73892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882809.73977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882809.73998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882809.74013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30564 1726882809.74142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882809.76026: stdout chunk (state=3): >>>ansible-tmp-1726882809.728499-30989-406478606652=/root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652 <<< 30564 1726882809.76218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882809.76221: stdout chunk (state=3): >>><<< 30564 1726882809.76224: stderr chunk (state=3): >>><<< 30564 1726882809.76571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882809.728499-30989-406478606652=/root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882809.76575: variable 'ansible_module_compression' from source: unknown 30564 1726882809.76578: ANSIBALLZ: Using lock for package_facts 30564 
1726882809.76581: ANSIBALLZ: Acquiring lock 30564 1726882809.76584: ANSIBALLZ: Lock acquired: 140506259213536 30564 1726882809.76587: ANSIBALLZ: Creating module 30564 1726882810.24079: ANSIBALLZ: Writing module into payload 30564 1726882810.24378: ANSIBALLZ: Writing module 30564 1726882810.24494: ANSIBALLZ: Renaming module 30564 1726882810.24574: ANSIBALLZ: Done creating module 30564 1726882810.24616: variable 'ansible_facts' from source: unknown 30564 1726882810.25049: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652/AnsiballZ_package_facts.py 30564 1726882810.25926: Sending initial data 30564 1726882810.25929: Sent initial data (158 bytes) 30564 1726882810.28636: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.28640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.28661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.28722: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882810.28732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.28745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882810.28831: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882810.28837: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882810.28845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.28855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882810.28870: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.28873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.28881: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882810.28891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.28970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882810.29054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882810.29066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882810.29275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882810.31135: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 30564 1726882810.31140: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882810.31237: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882810.31343: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp85scutaq /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652/AnsiballZ_package_facts.py <<< 30564 1726882810.31436: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 
1726882810.34644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882810.34668: stderr chunk (state=3): >>><<< 30564 1726882810.34672: stdout chunk (state=3): >>><<< 30564 1726882810.34776: done transferring module to remote 30564 1726882810.34779: _low_level_execute_command(): starting 30564 1726882810.34782: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652/ /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652/AnsiballZ_package_facts.py && sleep 0' 30564 1726882810.35327: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882810.35342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.35357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882810.35381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.35421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.35435: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882810.35450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.35469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882810.35484: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882810.35496: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882810.35509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.35522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882810.35542: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.35555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.35570: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882810.35584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.35660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882810.35679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882810.35694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882810.35848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882810.37608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882810.37661: stderr chunk (state=3): >>><<< 30564 1726882810.37664: stdout chunk (state=3): >>><<< 30564 1726882810.37681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882810.37684: _low_level_execute_command(): starting 30564 1726882810.37689: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652/AnsiballZ_package_facts.py && sleep 0' 30564 1726882810.38253: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882810.38263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.38279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882810.38292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.38327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.38334: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882810.38344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.38357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882810.38366: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882810.38382: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882810.38385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.38396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882810.38407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.38414: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.38420: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882810.38429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.38501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882810.38515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882810.38526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882810.38661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882810.84539: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": 
"pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", 
"release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 30564 1726882810.84559: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 30564 1726882810.84566: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": 
[{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 30564 1726882810.84577: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": 
"libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 30564 1726882810.84582: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 30564 1726882810.84590: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 30564 1726882810.84610: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", 
"release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": 
"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 30564 1726882810.84646: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 30564 1726882810.84679: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", 
"version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 30564 1726882810.84688: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 30564 1726882810.84691: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 30564 1726882810.84695: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": 
"3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", 
"release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 30564 1726882810.84705: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", 
"version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": 
[{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 30564 1726882810.84720: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", 
"release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 30564 1726882810.84737: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", 
"release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 30564 1726882810.84756: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 30564 1726882810.84760: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882810.86293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882810.86352: stderr chunk (state=3): >>><<< 30564 1726882810.86355: stdout chunk (state=3): >>><<< 30564 1726882810.86394: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
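The payload assembled above is the `ansible_facts.packages` structure returned by `package_facts`: each key is a package name mapped to a *list* of install records, because one name can have several entries (for example the two `gpg-pubkey` records logged earlier). A minimal sketch of reading such a structure, using a hand-copied excerpt of the facts shown above (not Ansible code, just plain Python over the same shape):

```python
# Excerpt of the ansible_facts.packages payload from the log above;
# field names match the package_facts module's documented output.
packages = {
    "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9",
                "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "gpg-pubkey": [
        {"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb",
         "epoch": None, "arch": None, "source": "rpm"},
        {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19",
         "epoch": None, "arch": None, "source": "rpm"},
    ],
}

def pkg_versions(packages, name):
    """Return every installed version recorded for one package name."""
    return [p["version"] for p in packages.get(name, [])]

print(pkg_versions(packages, "kernel"))      # → ['5.14.0']
print(pkg_versions(packages, "gpg-pubkey"))  # → ['3228467c', '8483c65d']
```

In a playbook the same lookup is typically done with `ansible_facts.packages['kernel']` after the facts task has run; note that in this log the result itself is censored (`no_log: true`), so only the debug stream reveals it.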
30564 1726882810.88293: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882810.88320: _low_level_execute_command(): starting 30564 1726882810.88330: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882809.728499-30989-406478606652/ > /dev/null 2>&1 && sleep 0' 30564 1726882810.88909: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882810.88922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.88937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882810.88955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.88997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.89007: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882810.89018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.89032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882810.89042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 
30564 1726882810.89050: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882810.89060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882810.89080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882810.89099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882810.89112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882810.89125: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882810.89140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882810.89217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882810.89241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882810.89257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882810.89397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882810.91324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882810.91328: stdout chunk (state=3): >>><<< 30564 1726882810.91330: stderr chunk (state=3): >>><<< 30564 1726882810.91373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882810.91376: handler run complete 30564 1726882810.92330: variable 'ansible_facts' from source: unknown 30564 1726882810.92839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882810.95176: variable 'ansible_facts' from source: unknown 30564 1726882810.95665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882810.96535: attempt loop complete, returning result 30564 1726882810.96553: _execute() done 30564 1726882810.96561: dumping result to json 30564 1726882810.96818: done dumping result, returning 30564 1726882810.96832: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-00000000026f] 30564 1726882810.96844: sending task result for task 0e448fcc-3ce9-4216-acec-00000000026f 30564 1726882811.00107: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000026f 30564 1726882811.00110: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882811.00258: no more pending results, returning what we have 30564 1726882811.00261: results queue empty 30564 1726882811.00262: checking for 
any_errors_fatal 30564 1726882811.00274: done checking for any_errors_fatal 30564 1726882811.00275: checking for max_fail_percentage 30564 1726882811.00277: done checking for max_fail_percentage 30564 1726882811.00278: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.00279: done checking to see if all hosts have failed 30564 1726882811.00280: getting the remaining hosts for this loop 30564 1726882811.00282: done getting the remaining hosts for this loop 30564 1726882811.00286: getting the next task for host managed_node2 30564 1726882811.00295: done getting next task for host managed_node2 30564 1726882811.00298: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882811.00304: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882811.00314: getting variables 30564 1726882811.00316: in VariableManager get_vars() 30564 1726882811.00346: Calling all_inventory to load vars for managed_node2 30564 1726882811.00349: Calling groups_inventory to load vars for managed_node2 30564 1726882811.00354: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.00369: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.00372: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.00375: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.02186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.06400: done with get_vars() 30564 1726882811.06426: done getting variables 30564 1726882811.06489: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:40:11 -0400 (0:00:01.392) 0:00:09.646 ****** 30564 1726882811.06521: entering _queue_task() for managed_node2/debug 30564 1726882811.07220: worker is 1 (out of 1 available) 30564 1726882811.07230: exiting _queue_task() for managed_node2/debug 30564 1726882811.07242: done queuing things up, now waiting for results queue to drain 30564 1726882811.07243: waiting for pending results... 
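
Editor's note: the `auto-mux: Trying existing master` / `mux_client_request_session` stderr chunks earlier in this log are OpenSSH connection multiplexing at work — Ansible reuses one persistent SSH "master" connection for every command it runs on the host, which is why each `_low_level_execute_command()` completes in milliseconds. A minimal sketch of client configuration that produces this behavior (the `Host` pattern and socket path here are illustrative assumptions, not taken from this run; by default Ansible's ssh connection plugin passes the equivalent `-o ControlMaster=auto -o ControlPersist=60s` options itself):

```
# Hypothetical ~/.ssh/config entry; Ansible's ssh plugin sets
# equivalent options on the command line by default.
Host 10.31.11.158
    ControlMaster auto               # reuse an existing master connection if one exists
    ControlPath ~/.ssh/cm-%r@%h:%p   # socket for the multiplexed session (assumed path)
    ControlPersist 60s               # keep the master alive briefly after each session
```

When a master already exists, subsequent sessions attach to it over the control socket (`mux_client_hello_exchange`, `master session id: 2` above) instead of performing a new TCP and key exchange handshake.
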
30564 1726882811.07762: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882811.07991: in run() - task 0e448fcc-3ce9-4216-acec-00000000020d 30564 1726882811.08003: variable 'ansible_search_path' from source: unknown 30564 1726882811.08007: variable 'ansible_search_path' from source: unknown 30564 1726882811.08039: calling self._execute() 30564 1726882811.08230: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.08234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.08245: variable 'omit' from source: magic vars 30564 1726882811.09009: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.09134: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.09141: variable 'omit' from source: magic vars 30564 1726882811.09202: variable 'omit' from source: magic vars 30564 1726882811.09336: variable 'network_provider' from source: set_fact 30564 1726882811.09478: variable 'omit' from source: magic vars 30564 1726882811.09526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882811.09639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882811.09692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882811.09759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882811.10583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882811.10618: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882811.10866: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882811.10876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.10979: Set connection var ansible_timeout to 10 30564 1726882811.11078: Set connection var ansible_pipelining to False 30564 1726882811.11143: Set connection var ansible_shell_type to sh 30564 1726882811.11156: Set connection var ansible_shell_executable to /bin/sh 30564 1726882811.11171: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882811.11181: Set connection var ansible_connection to ssh 30564 1726882811.11210: variable 'ansible_shell_executable' from source: unknown 30564 1726882811.11875: variable 'ansible_connection' from source: unknown 30564 1726882811.11884: variable 'ansible_module_compression' from source: unknown 30564 1726882811.11891: variable 'ansible_shell_type' from source: unknown 30564 1726882811.11898: variable 'ansible_shell_executable' from source: unknown 30564 1726882811.11905: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.11912: variable 'ansible_pipelining' from source: unknown 30564 1726882811.11918: variable 'ansible_timeout' from source: unknown 30564 1726882811.11926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.12069: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882811.12087: variable 'omit' from source: magic vars 30564 1726882811.12098: starting attempt loop 30564 1726882811.12105: running the handler 30564 1726882811.12151: handler run complete 30564 1726882811.12172: attempt loop complete, returning result 30564 1726882811.12179: _execute() done 30564 1726882811.12185: dumping result to json 30564 1726882811.12191: done dumping result, returning 
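
Editor's note: the block above shows the full per-task setup — connection vars resolved, the `debug` action plugin loaded from cache — even though `debug` runs entirely on the controller and never opens the SSH connection it configures. A minimal sketch of what the task at `roles/network/tasks/main.yml:7` likely looks like (the variable name `network_provider` and the evaluated conditional both appear in this log; the exact YAML is an assumption, not the role's verbatim source):

```yaml
# Hypothetical reconstruction of the "Print network provider" task
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
  # the log shows this conditional evaluated to True for this host
  when: ansible_distribution_major_version != '6'
```

The result of this task ("Using network provider: nm") is emitted in the next record below.
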
30564 1726882811.12201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-00000000020d] 30564 1726882811.12210: sending task result for task 0e448fcc-3ce9-4216-acec-00000000020d ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882811.12360: no more pending results, returning what we have 30564 1726882811.12365: results queue empty 30564 1726882811.12367: checking for any_errors_fatal 30564 1726882811.12376: done checking for any_errors_fatal 30564 1726882811.12376: checking for max_fail_percentage 30564 1726882811.12378: done checking for max_fail_percentage 30564 1726882811.12379: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.12380: done checking to see if all hosts have failed 30564 1726882811.12381: getting the remaining hosts for this loop 30564 1726882811.12383: done getting the remaining hosts for this loop 30564 1726882811.12387: getting the next task for host managed_node2 30564 1726882811.12396: done getting next task for host managed_node2 30564 1726882811.12401: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882811.12408: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.12419: getting variables 30564 1726882811.12421: in VariableManager get_vars() 30564 1726882811.12457: Calling all_inventory to load vars for managed_node2 30564 1726882811.12460: Calling groups_inventory to load vars for managed_node2 30564 1726882811.12462: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.12474: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.12477: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.12480: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.13482: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000020d 30564 1726882811.13486: WORKER PROCESS EXITING 30564 1726882811.14135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.16103: done with get_vars() 30564 1726882811.16130: done getting variables 30564 1726882811.16332: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:40:11 -0400 (0:00:00.098) 0:00:09.745 ****** 30564 1726882811.16377: entering _queue_task() for managed_node2/fail 30564 1726882811.16379: Creating lock for fail 30564 1726882811.16728: worker is 1 (out of 1 available) 30564 1726882811.16743: exiting _queue_task() for managed_node2/fail 30564 1726882811.16754: done queuing things up, now waiting for results queue to drain 30564 1726882811.16756: waiting for pending results... 30564 1726882811.17113: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882811.17272: in run() - task 0e448fcc-3ce9-4216-acec-00000000020e 30564 1726882811.17293: variable 'ansible_search_path' from source: unknown 30564 1726882811.17304: variable 'ansible_search_path' from source: unknown 30564 1726882811.17351: calling self._execute() 30564 1726882811.17451: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.17463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.17487: variable 'omit' from source: magic vars 30564 1726882811.17896: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.17921: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.18060: variable 'network_state' from source: role '' defaults 30564 1726882811.18082: Evaluated conditional (network_state != {}): False 30564 1726882811.18091: when evaluation is False, skipping this task 30564 1726882811.18103: _execute() done 30564 1726882811.18110: dumping result to json 30564 1726882811.18116: done dumping result, returning 30564 1726882811.18130: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-00000000020e] 30564 1726882811.18145: sending task result for task 0e448fcc-3ce9-4216-acec-00000000020e 30564 1726882811.18270: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000020e 30564 1726882811.18280: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882811.18328: no more pending results, returning what we have 30564 1726882811.18333: results queue empty 30564 1726882811.18334: checking for any_errors_fatal 30564 1726882811.18341: done checking for any_errors_fatal 30564 1726882811.18342: checking for max_fail_percentage 30564 1726882811.18344: done checking for max_fail_percentage 30564 1726882811.18345: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.18346: done checking to see if all hosts have failed 30564 1726882811.18347: getting the remaining hosts for this loop 30564 1726882811.18349: done getting the remaining hosts for this loop 30564 1726882811.18353: getting the next task for host managed_node2 30564 1726882811.18364: done getting next task for host managed_node2 30564 1726882811.18371: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882811.18379: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.18395: getting variables 30564 1726882811.18397: in VariableManager get_vars() 30564 1726882811.18432: Calling all_inventory to load vars for managed_node2 30564 1726882811.18435: Calling groups_inventory to load vars for managed_node2 30564 1726882811.18438: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.18450: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.18453: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.18457: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.20241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.22926: done with get_vars() 30564 1726882811.22954: done getting variables 30564 1726882811.23018: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:40:11 -0400 (0:00:00.066) 0:00:09.811 ****** 30564 1726882811.23055: entering _queue_task() for managed_node2/fail 30564 1726882811.23335: worker is 1 (out of 1 available) 30564 1726882811.23347: exiting _queue_task() for managed_node2/fail 30564 1726882811.23359: done queuing things up, now waiting for results queue to drain 30564 1726882811.23361: waiting for pending results... 30564 1726882811.23656: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882811.23803: in run() - task 0e448fcc-3ce9-4216-acec-00000000020f 30564 1726882811.23832: variable 'ansible_search_path' from source: unknown 30564 1726882811.23841: variable 'ansible_search_path' from source: unknown 30564 1726882811.23882: calling self._execute() 30564 1726882811.23979: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.23991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.24005: variable 'omit' from source: magic vars 30564 1726882811.24414: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.24431: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.24619: variable 'network_state' from source: role '' defaults 30564 1726882811.24633: Evaluated conditional (network_state != {}): False 30564 1726882811.24640: when evaluation is False, skipping this task 30564 1726882811.24647: _execute() done 30564 1726882811.24653: dumping result to json 30564 1726882811.24660: done dumping result, returning 30564 1726882811.24694: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-00000000020f] 30564 1726882811.24717: sending task result for task 0e448fcc-3ce9-4216-acec-00000000020f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882811.25342: no more pending results, returning what we have 30564 1726882811.25346: results queue empty 30564 1726882811.25347: checking for any_errors_fatal 30564 1726882811.25360: done checking for any_errors_fatal 30564 1726882811.25361: checking for max_fail_percentage 30564 1726882811.25362: done checking for max_fail_percentage 30564 1726882811.25365: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.25366: done checking to see if all hosts have failed 30564 1726882811.25368: getting the remaining hosts for this loop 30564 1726882811.25370: done getting the remaining hosts for this loop 30564 1726882811.25373: getting the next task for host managed_node2 30564 1726882811.25384: done getting next task for host managed_node2 30564 1726882811.25388: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882811.25394: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.25408: getting variables 30564 1726882811.25410: in VariableManager get_vars() 30564 1726882811.25447: Calling all_inventory to load vars for managed_node2 30564 1726882811.25450: Calling groups_inventory to load vars for managed_node2 30564 1726882811.25453: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.25470: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.25474: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.25480: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.26116: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000020f 30564 1726882811.26119: WORKER PROCESS EXITING 30564 1726882811.27878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.30020: done with get_vars() 30564 1726882811.30057: done getting variables 30564 1726882811.30117: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:40:11 -0400 (0:00:00.070) 0:00:09.882 ****** 30564 1726882811.30154: entering _queue_task() for managed_node2/fail 30564 1726882811.30776: worker is 1 (out of 1 available) 30564 1726882811.30789: exiting _queue_task() for managed_node2/fail 30564 1726882811.30800: done queuing things up, now waiting for results queue to drain 30564 1726882811.30801: waiting for pending results... 30564 1726882811.31105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882811.31252: in run() - task 0e448fcc-3ce9-4216-acec-000000000210 30564 1726882811.31278: variable 'ansible_search_path' from source: unknown 30564 1726882811.31285: variable 'ansible_search_path' from source: unknown 30564 1726882811.31399: calling self._execute() 30564 1726882811.31607: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.31619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.31634: variable 'omit' from source: magic vars 30564 1726882811.32145: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.32172: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.32487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882811.34078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882811.34129: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882811.34155: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882811.34182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882811.34204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882811.34260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.34285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.34320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.34344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.34355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.34448: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.34461: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882811.34469: when evaluation is False, skipping this task 30564 1726882811.34473: _execute() done 30564 1726882811.34475: dumping result to json 30564 1726882811.34478: done dumping result, returning 30564 1726882811.34481: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0e448fcc-3ce9-4216-acec-000000000210] 30564 1726882811.34486: sending task result for task 0e448fcc-3ce9-4216-acec-000000000210 30564 1726882811.34577: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000210 30564 1726882811.34579: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882811.34620: no more pending results, returning what we have 30564 1726882811.34623: results queue empty 30564 1726882811.34624: checking for any_errors_fatal 30564 1726882811.34630: done checking for any_errors_fatal 30564 1726882811.34631: checking for max_fail_percentage 30564 1726882811.34633: done checking for max_fail_percentage 30564 1726882811.34633: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.34634: done checking to see if all hosts have failed 30564 1726882811.34635: getting the remaining hosts for this loop 30564 1726882811.34636: done getting the remaining hosts for this loop 30564 1726882811.34640: getting the next task for host managed_node2 30564 1726882811.34647: done getting next task for host managed_node2 30564 1726882811.34651: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882811.34656: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.34674: getting variables 30564 1726882811.34677: in VariableManager get_vars() 30564 1726882811.34709: Calling all_inventory to load vars for managed_node2 30564 1726882811.34711: Calling groups_inventory to load vars for managed_node2 30564 1726882811.34713: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.34722: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.34724: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.34726: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.36050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.36992: done with get_vars() 30564 1726882811.37008: done getting variables 30564 1726882811.37075: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:40:11 -0400 (0:00:00.069) 0:00:09.952 ****** 30564 1726882811.37097: entering _queue_task() for managed_node2/dnf 30564 1726882811.37284: worker is 1 (out of 1 available) 30564 1726882811.37297: exiting _queue_task() for managed_node2/dnf 30564 1726882811.37308: done queuing things up, now waiting for results queue to drain 30564 1726882811.37309: waiting for pending results... 30564 1726882811.37481: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882811.37558: in run() - task 0e448fcc-3ce9-4216-acec-000000000211 30564 1726882811.37569: variable 'ansible_search_path' from source: unknown 30564 1726882811.37575: variable 'ansible_search_path' from source: unknown 30564 1726882811.37603: calling self._execute() 30564 1726882811.37662: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.37669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.37681: variable 'omit' from source: magic vars 30564 1726882811.38170: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.38174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.38289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882811.40533: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882811.40578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882811.40614: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882811.40637: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882811.40657: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882811.40717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.40737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.40756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.40787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.40799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.40875: variable 'ansible_distribution' from source: facts 30564 1726882811.40878: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.40890: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882811.40965: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882811.41046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882811.41062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.41086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.41112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.41123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.41152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.41171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.41189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.41214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.41224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.41252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.41271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.41289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.41314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.41324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.41454: variable 'network_connections' from source: include params 30564 1726882811.41462: variable 'interface' from source: play vars 30564 1726882811.41533: variable 'interface' from source: play vars 30564 1726882811.41607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882811.41744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882811.41777: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882811.42578: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882811.42581: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882811.42583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882811.42588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882811.42595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.42597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882811.42599: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882811.42601: variable 'network_connections' from source: include params 30564 1726882811.42603: variable 'interface' from source: play vars 30564 1726882811.42605: variable 'interface' from source: play vars 30564 1726882811.42606: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882811.42608: when evaluation is False, skipping this task 30564 1726882811.42610: _execute() done 30564 1726882811.42612: dumping result to json 30564 1726882811.42613: done dumping result, returning 30564 1726882811.42615: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000211] 30564 1726882811.42617: sending task result for task 0e448fcc-3ce9-4216-acec-000000000211 skipping: [managed_node2] 
=> { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882811.42737: no more pending results, returning what we have 30564 1726882811.42740: results queue empty 30564 1726882811.42741: checking for any_errors_fatal 30564 1726882811.42746: done checking for any_errors_fatal 30564 1726882811.42747: checking for max_fail_percentage 30564 1726882811.42749: done checking for max_fail_percentage 30564 1726882811.42749: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.42750: done checking to see if all hosts have failed 30564 1726882811.42751: getting the remaining hosts for this loop 30564 1726882811.42752: done getting the remaining hosts for this loop 30564 1726882811.42755: getting the next task for host managed_node2 30564 1726882811.42762: done getting next task for host managed_node2 30564 1726882811.42770: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882811.42775: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.42788: getting variables 30564 1726882811.42789: in VariableManager get_vars() 30564 1726882811.42826: Calling all_inventory to load vars for managed_node2 30564 1726882811.42828: Calling groups_inventory to load vars for managed_node2 30564 1726882811.42831: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.42839: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.42841: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.42844: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.43582: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000211 30564 1726882811.43585: WORKER PROCESS EXITING 30564 1726882811.44213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.45413: done with get_vars() 30564 1726882811.45428: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882811.45483: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:40:11 -0400 (0:00:00.084) 0:00:10.036 ****** 30564 1726882811.45504: entering _queue_task() for managed_node2/yum 30564 1726882811.45505: Creating lock for yum 30564 1726882811.45693: worker is 1 (out of 1 available) 30564 1726882811.45707: exiting _queue_task() for managed_node2/yum 30564 1726882811.45718: done queuing things up, now waiting for results queue to drain 30564 1726882811.45719: waiting for pending results... 30564 1726882811.45879: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882811.45960: in run() - task 0e448fcc-3ce9-4216-acec-000000000212 30564 1726882811.45973: variable 'ansible_search_path' from source: unknown 30564 1726882811.45977: variable 'ansible_search_path' from source: unknown 30564 1726882811.46004: calling self._execute() 30564 1726882811.46065: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.46071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.46078: variable 'omit' from source: magic vars 30564 1726882811.46322: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.46332: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.46532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882811.49014: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882811.49063: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882811.49094: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882811.49118: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882811.49138: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882811.49195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.49215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.49232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.49257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.49271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.49334: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.49345: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882811.49348: when evaluation is False, skipping this task 30564 1726882811.49350: _execute() done 30564 1726882811.49353: dumping result to json 30564 1726882811.49355: done dumping result, returning 30564 1726882811.49363: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000212] 30564 1726882811.49371: sending task result for task 0e448fcc-3ce9-4216-acec-000000000212 30564 1726882811.49451: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000212 30564 1726882811.49454: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882811.49505: no more pending results, returning what we have 30564 1726882811.49508: results queue empty 30564 1726882811.49509: checking for any_errors_fatal 30564 1726882811.49514: done checking for any_errors_fatal 30564 1726882811.49515: checking for max_fail_percentage 30564 1726882811.49516: done checking for max_fail_percentage 30564 1726882811.49517: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.49518: done checking to see if all hosts have failed 30564 1726882811.49518: getting the remaining hosts for this loop 30564 1726882811.49520: done getting the remaining hosts for this loop 30564 1726882811.49523: getting the next task for host managed_node2 30564 1726882811.49530: done getting next task for host managed_node2 30564 1726882811.49533: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882811.49538: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.49550: getting variables 30564 1726882811.49551: in VariableManager get_vars() 30564 1726882811.49583: Calling all_inventory to load vars for managed_node2 30564 1726882811.49585: Calling groups_inventory to load vars for managed_node2 30564 1726882811.49587: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.49595: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.49597: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.49600: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.50614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.52324: done with get_vars() 30564 1726882811.52345: done getting variables 30564 1726882811.52404: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager 
due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:40:11 -0400 (0:00:00.069) 0:00:10.105 ****** 30564 1726882811.52438: entering _queue_task() for managed_node2/fail 30564 1726882811.52685: worker is 1 (out of 1 available) 30564 1726882811.52697: exiting _queue_task() for managed_node2/fail 30564 1726882811.52708: done queuing things up, now waiting for results queue to drain 30564 1726882811.52709: waiting for pending results... 30564 1726882811.52982: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882811.53121: in run() - task 0e448fcc-3ce9-4216-acec-000000000213 30564 1726882811.53139: variable 'ansible_search_path' from source: unknown 30564 1726882811.53151: variable 'ansible_search_path' from source: unknown 30564 1726882811.53193: calling self._execute() 30564 1726882811.53292: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.53304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.53318: variable 'omit' from source: magic vars 30564 1726882811.53660: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.53685: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.53807: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882811.54006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882811.56298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882811.56377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882811.56420: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882811.56458: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882811.56493: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882811.56579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.56618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.56649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.56701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.56725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.56777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.56805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.56839: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.56887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.56907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.56955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.56988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.57021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.57073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.57094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.57281: variable 'network_connections' from source: include params 30564 1726882811.57296: variable 'interface' from source: play vars 30564 1726882811.57365: variable 'interface' from source: play vars 30564 1726882811.57441: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882811.57621: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882811.57662: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882811.57707: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882811.57740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882811.57790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882811.57821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882811.57849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.57885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882811.57948: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882811.58291: variable 'network_connections' from source: include params 30564 1726882811.58302: variable 'interface' from source: play vars 30564 1726882811.58373: variable 'interface' from source: play vars 30564 1726882811.58407: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882811.58414: when evaluation is False, skipping this task 30564 
1726882811.58420: _execute() done 30564 1726882811.58425: dumping result to json 30564 1726882811.58431: done dumping result, returning 30564 1726882811.58440: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000213] 30564 1726882811.58448: sending task result for task 0e448fcc-3ce9-4216-acec-000000000213 30564 1726882811.58555: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000213 30564 1726882811.58562: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882811.58625: no more pending results, returning what we have 30564 1726882811.58629: results queue empty 30564 1726882811.58630: checking for any_errors_fatal 30564 1726882811.58635: done checking for any_errors_fatal 30564 1726882811.58635: checking for max_fail_percentage 30564 1726882811.58637: done checking for max_fail_percentage 30564 1726882811.58638: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.58639: done checking to see if all hosts have failed 30564 1726882811.58640: getting the remaining hosts for this loop 30564 1726882811.58642: done getting the remaining hosts for this loop 30564 1726882811.58645: getting the next task for host managed_node2 30564 1726882811.58653: done getting next task for host managed_node2 30564 1726882811.58657: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882811.58662: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.58680: getting variables 30564 1726882811.58682: in VariableManager get_vars() 30564 1726882811.58716: Calling all_inventory to load vars for managed_node2 30564 1726882811.58718: Calling groups_inventory to load vars for managed_node2 30564 1726882811.58721: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.58731: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.58734: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.58737: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.60622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.63242: done with get_vars() 30564 1726882811.63372: done getting variables 30564 1726882811.63429: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:40:11 -0400 (0:00:00.110) 0:00:10.215 ****** 30564 1726882811.63465: entering _queue_task() for managed_node2/package 30564 1726882811.63757: worker is 1 (out of 1 available) 30564 1726882811.63773: exiting _queue_task() for managed_node2/package 30564 1726882811.63787: done queuing things up, now waiting for results queue to drain 30564 1726882811.63788: waiting for pending results... 30564 1726882811.64073: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882811.64214: in run() - task 0e448fcc-3ce9-4216-acec-000000000214 30564 1726882811.64239: variable 'ansible_search_path' from source: unknown 30564 1726882811.64248: variable 'ansible_search_path' from source: unknown 30564 1726882811.64290: calling self._execute() 30564 1726882811.64386: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.64398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.64412: variable 'omit' from source: magic vars 30564 1726882811.64829: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.64848: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.65048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882811.65430: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882811.65483: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882811.65520: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882811.65562: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882811.65677: variable 'network_packages' from source: role '' defaults 30564 1726882811.65793: variable '__network_provider_setup' from source: role '' defaults 30564 1726882811.65808: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882811.65883: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882811.65895: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882811.65958: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882811.66151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882811.72212: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882811.72294: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882811.72333: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882811.72374: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882811.72405: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882811.72478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.72513: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.72543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.72597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.72617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.72663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.72698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.72729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.72779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.72800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882811.73038: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882811.73158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.73193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.73223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.73274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.73294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.73390: variable 'ansible_python' from source: facts 30564 1726882811.73411: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882811.73502: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882811.73592: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882811.73726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.73753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.73791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.73834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.73851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.73908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882811.73944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882811.73981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.74029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882811.74047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882811.74188: variable 'network_connections' from source: include params 
30564 1726882811.74198: variable 'interface' from source: play vars 30564 1726882811.74307: variable 'interface' from source: play vars 30564 1726882811.74378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882811.74406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882811.74442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882811.74481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882811.74519: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882811.74808: variable 'network_connections' from source: include params 30564 1726882811.74817: variable 'interface' from source: play vars 30564 1726882811.74923: variable 'interface' from source: play vars 30564 1726882811.74981: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882811.75060: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882811.75377: variable 'network_connections' from source: include params 30564 1726882811.75387: variable 'interface' from source: play vars 30564 1726882811.75456: variable 'interface' from source: play vars 30564 1726882811.75489: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882811.75575: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882811.75887: variable 'network_connections' 
from source: include params 30564 1726882811.75897: variable 'interface' from source: play vars 30564 1726882811.75965: variable 'interface' from source: play vars 30564 1726882811.76024: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882811.76093: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882811.76104: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882811.76166: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882811.76410: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882811.76955: variable 'network_connections' from source: include params 30564 1726882811.76966: variable 'interface' from source: play vars 30564 1726882811.77032: variable 'interface' from source: play vars 30564 1726882811.77050: variable 'ansible_distribution' from source: facts 30564 1726882811.77058: variable '__network_rh_distros' from source: role '' defaults 30564 1726882811.77073: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.77108: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882811.77291: variable 'ansible_distribution' from source: facts 30564 1726882811.77300: variable '__network_rh_distros' from source: role '' defaults 30564 1726882811.77310: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.77323: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882811.77497: variable 'ansible_distribution' from source: facts 30564 1726882811.77505: variable '__network_rh_distros' from source: role '' defaults 30564 1726882811.77514: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.77548: variable 'network_provider' from source: set_fact 30564 
1726882811.77571: variable 'ansible_facts' from source: unknown 30564 1726882811.78256: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882811.78265: when evaluation is False, skipping this task 30564 1726882811.78431: _execute() done 30564 1726882811.78595: dumping result to json 30564 1726882811.78603: done dumping result, returning 30564 1726882811.78613: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000000214] 30564 1726882811.78620: sending task result for task 0e448fcc-3ce9-4216-acec-000000000214
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
30564 1726882811.78761: no more pending results, returning what we have 30564 1726882811.78769: results queue empty 30564 1726882811.78770: checking for any_errors_fatal 30564 1726882811.78777: done checking for any_errors_fatal 30564 1726882811.78778: checking for max_fail_percentage 30564 1726882811.78780: done checking for max_fail_percentage 30564 1726882811.78781: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.78782: done checking to see if all hosts have failed 30564 1726882811.78783: getting the remaining hosts for this loop 30564 1726882811.78784: done getting the remaining hosts for this loop 30564 1726882811.78788: getting the next task for host managed_node2 30564 1726882811.78796: done getting next task for host managed_node2 30564 1726882811.78800: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882811.78805: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.78818: getting variables 30564 1726882811.78820: in VariableManager get_vars() 30564 1726882811.78853: Calling all_inventory to load vars for managed_node2 30564 1726882811.78855: Calling groups_inventory to load vars for managed_node2 30564 1726882811.78862: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.78875: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.78878: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.78881: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.79888: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000214 30564 1726882811.79891: WORKER PROCESS EXITING 30564 1726882811.84613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.86344: done with get_vars() 30564 1726882811.86371: done getting variables 30564 1726882811.86415: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:40:11 -0400 (0:00:00.229) 0:00:10.445 ****** 30564 1726882811.86444: entering _queue_task() for managed_node2/package 30564 1726882811.86748: worker is 1 (out of 1 available) 30564 1726882811.86759: exiting _queue_task() for managed_node2/package 30564 1726882811.86775: done queuing things up, now waiting for results queue to drain 30564 1726882811.86777: waiting for pending results... 30564 1726882811.87044: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882811.87193: in run() - task 0e448fcc-3ce9-4216-acec-000000000215 30564 1726882811.87212: variable 'ansible_search_path' from source: unknown 30564 1726882811.87222: variable 'ansible_search_path' from source: unknown 30564 1726882811.87259: calling self._execute() 30564 1726882811.87361: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.87381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.87397: variable 'omit' from source: magic vars 30564 1726882811.87770: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.87790: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.87916: variable 'network_state' from source: role '' defaults 30564 1726882811.87930: Evaluated conditional (network_state != {}): False 30564 1726882811.87936: when evaluation 
is False, skipping this task 30564 1726882811.87943: _execute() done 30564 1726882811.87949: dumping result to json 30564 1726882811.87955: done dumping result, returning 30564 1726882811.87972: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000215] 30564 1726882811.87987: sending task result for task 0e448fcc-3ce9-4216-acec-000000000215
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882811.88140: no more pending results, returning what we have 30564 1726882811.88144: results queue empty 30564 1726882811.88145: checking for any_errors_fatal 30564 1726882811.88154: done checking for any_errors_fatal 30564 1726882811.88155: checking for max_fail_percentage 30564 1726882811.88157: done checking for max_fail_percentage 30564 1726882811.88157: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.88158: done checking to see if all hosts have failed 30564 1726882811.88159: getting the remaining hosts for this loop 30564 1726882811.88161: done getting the remaining hosts for this loop 30564 1726882811.88166: getting the next task for host managed_node2 30564 1726882811.88178: done getting next task for host managed_node2 30564 1726882811.88182: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882811.88188: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.88204: getting variables 30564 1726882811.88206: in VariableManager get_vars() 30564 1726882811.88240: Calling all_inventory to load vars for managed_node2 30564 1726882811.88243: Calling groups_inventory to load vars for managed_node2 30564 1726882811.88246: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.88257: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.88259: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.88262: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.89285: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000215 30564 1726882811.89288: WORKER PROCESS EXITING 30564 1726882811.90047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.92679: done with get_vars() 30564 1726882811.92700: done getting variables 30564 1726882811.92755: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:40:11 -0400 (0:00:00.063) 0:00:10.509 ******

30564 1726882811.92792: entering _queue_task() for managed_node2/package 30564 1726882811.93635: worker is 1 (out of 1 available) 30564 1726882811.93646: exiting _queue_task() for managed_node2/package 30564 1726882811.93657: done queuing things up, now waiting for results queue to drain 30564 1726882811.93658: waiting for pending results... 30564 1726882811.93918: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882811.94060: in run() - task 0e448fcc-3ce9-4216-acec-000000000216 30564 1726882811.94085: variable 'ansible_search_path' from source: unknown 30564 1726882811.94099: variable 'ansible_search_path' from source: unknown 30564 1726882811.94135: calling self._execute() 30564 1726882811.94226: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.94238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.94250: variable 'omit' from source: magic vars 30564 1726882811.94601: variable 'ansible_distribution_major_version' from source: facts 30564 1726882811.94617: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882811.94744: variable 'network_state' from source: role '' defaults 30564 1726882811.94759: Evaluated conditional (network_state != {}): False 30564 1726882811.94771: when evaluation is False, skipping this task 30564 1726882811.94778: _execute() done 30564 1726882811.94785: dumping
result to json 30564 1726882811.94791: done dumping result, returning 30564 1726882811.94802: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000216] 30564 1726882811.94812: sending task result for task 0e448fcc-3ce9-4216-acec-000000000216
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882811.94956: no more pending results, returning what we have 30564 1726882811.94959: results queue empty 30564 1726882811.94961: checking for any_errors_fatal 30564 1726882811.94973: done checking for any_errors_fatal 30564 1726882811.94974: checking for max_fail_percentage 30564 1726882811.94976: done checking for max_fail_percentage 30564 1726882811.94977: checking to see if all hosts have failed and the running result is not ok 30564 1726882811.94978: done checking to see if all hosts have failed 30564 1726882811.94978: getting the remaining hosts for this loop 30564 1726882811.94980: done getting the remaining hosts for this loop 30564 1726882811.94984: getting the next task for host managed_node2 30564 1726882811.94992: done getting next task for host managed_node2 30564 1726882811.94996: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882811.95001: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882811.95015: getting variables 30564 1726882811.95017: in VariableManager get_vars() 30564 1726882811.95049: Calling all_inventory to load vars for managed_node2 30564 1726882811.95052: Calling groups_inventory to load vars for managed_node2 30564 1726882811.95054: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882811.95070: Calling all_plugins_play to load vars for managed_node2 30564 1726882811.95074: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882811.95079: Calling groups_plugins_play to load vars for managed_node2 30564 1726882811.96186: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000216 30564 1726882811.96190: WORKER PROCESS EXITING 30564 1726882811.96712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882811.98510: done with get_vars() 30564 1726882811.98532: done getting variables 30564 1726882811.98626: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:40:11 -0400 (0:00:00.058) 0:00:10.567 ******

30564 1726882811.98657: entering _queue_task() for managed_node2/service 30564 1726882811.98659: Creating lock for service 30564 1726882811.98922: worker is 1 (out of 1 available) 30564 1726882811.98935: exiting _queue_task() for managed_node2/service 30564 1726882811.98947: done queuing things up, now waiting for results queue to drain 30564 1726882811.98948: waiting for pending results... 30564 1726882811.99206: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882811.99355: in run() - task 0e448fcc-3ce9-4216-acec-000000000217 30564 1726882811.99377: variable 'ansible_search_path' from source: unknown 30564 1726882811.99386: variable 'ansible_search_path' from source: unknown 30564 1726882811.99422: calling self._execute() 30564 1726882811.99514: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882811.99523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882811.99536: variable 'omit' from source: magic vars 30564 1726882812.00196: variable 'ansible_distribution_major_version' from source: facts 30564 1726882812.00216: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882812.00511: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882812.00884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882812.05946:
Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882812.06021: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882812.06085: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882812.06124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882812.06159: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882812.06244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882812.06286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882812.06318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.06373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.06416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882812.06469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 
1726882812.06517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882812.06546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.06598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.06616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882812.06718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882812.06776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882812.06805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.06853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.06889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30564 1726882812.07083: variable 'network_connections' from source: include params 30564 1726882812.07101: variable 'interface' from source: play vars 30564 1726882812.07189: variable 'interface' from source: play vars 30564 1726882812.07274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882812.07446: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882812.07498: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882812.07534: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882812.07589: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882812.07634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882812.07661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882812.07702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.07735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882812.07807: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882812.08071: variable 'network_connections' from source: include params 30564 1726882812.08082: variable 'interface' from source: play 
vars 30564 1726882812.08150: variable 'interface' from source: play vars 30564 1726882812.08191: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882812.08200: when evaluation is False, skipping this task 30564 1726882812.08207: _execute() done 30564 1726882812.08218: dumping result to json 30564 1726882812.08225: done dumping result, returning 30564 1726882812.08236: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000217] 30564 1726882812.08246: sending task result for task 0e448fcc-3ce9-4216-acec-000000000217 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882812.08413: no more pending results, returning what we have 30564 1726882812.08417: results queue empty 30564 1726882812.08418: checking for any_errors_fatal 30564 1726882812.08426: done checking for any_errors_fatal 30564 1726882812.08427: checking for max_fail_percentage 30564 1726882812.08429: done checking for max_fail_percentage 30564 1726882812.08430: checking to see if all hosts have failed and the running result is not ok 30564 1726882812.08430: done checking to see if all hosts have failed 30564 1726882812.08431: getting the remaining hosts for this loop 30564 1726882812.08433: done getting the remaining hosts for this loop 30564 1726882812.08437: getting the next task for host managed_node2 30564 1726882812.08446: done getting next task for host managed_node2 30564 1726882812.08451: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882812.08455: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882812.08474: getting variables 30564 1726882812.08477: in VariableManager get_vars() 30564 1726882812.08513: Calling all_inventory to load vars for managed_node2 30564 1726882812.08516: Calling groups_inventory to load vars for managed_node2 30564 1726882812.08518: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882812.08536: Calling all_plugins_play to load vars for managed_node2 30564 1726882812.08540: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882812.08543: Calling groups_plugins_play to load vars for managed_node2 30564 1726882812.10004: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000217 30564 1726882812.10008: WORKER PROCESS EXITING 30564 1726882812.10676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882812.12552: done with get_vars() 30564 1726882812.12580: done getting variables 30564 1726882812.12635: Loading ActionModule 
'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:40:12 -0400 (0:00:00.140) 0:00:10.707 ****** 30564 1726882812.12669: entering _queue_task() for managed_node2/service 30564 1726882812.12957: worker is 1 (out of 1 available) 30564 1726882812.12974: exiting _queue_task() for managed_node2/service 30564 1726882812.12985: done queuing things up, now waiting for results queue to drain 30564 1726882812.12986: waiting for pending results... 30564 1726882812.13255: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882812.13400: in run() - task 0e448fcc-3ce9-4216-acec-000000000218 30564 1726882812.13421: variable 'ansible_search_path' from source: unknown 30564 1726882812.13431: variable 'ansible_search_path' from source: unknown 30564 1726882812.13474: calling self._execute() 30564 1726882812.13573: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882812.13584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882812.13598: variable 'omit' from source: magic vars 30564 1726882812.14512: variable 'ansible_distribution_major_version' from source: facts 30564 1726882812.14529: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882812.14702: variable 'network_provider' from source: set_fact 30564 1726882812.14716: variable 'network_state' from source: role '' defaults 30564 1726882812.14731: Evaluated conditional (network_provider == "nm" or network_state != {}): 
True 30564 1726882812.14741: variable 'omit' from source: magic vars 30564 1726882812.14808: variable 'omit' from source: magic vars 30564 1726882812.14845: variable 'network_service_name' from source: role '' defaults 30564 1726882812.14920: variable 'network_service_name' from source: role '' defaults 30564 1726882812.15032: variable '__network_provider_setup' from source: role '' defaults 30564 1726882812.15156: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882812.15230: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882812.15270: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882812.15427: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882812.15856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882812.18761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882812.18857: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882812.18908: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882812.18958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882812.18996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882812.19085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882812.19117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 30564 1726882812.19155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.19211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.19229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882812.19290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882812.19321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882812.19358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.19410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.19433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882812.19693: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882812.19824: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882812.19859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882812.19894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.19946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.19969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882812.20065: variable 'ansible_python' from source: facts 30564 1726882812.20089: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882812.20182: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882812.20271: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882812.20401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882812.20428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882812.20459: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.20509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.20527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882812.20590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882812.20626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882812.20660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.20715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882812.20732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882812.20895: variable 'network_connections' from source: include params 30564 1726882812.20910: variable 'interface' from source: play vars 30564 1726882812.20991: variable 'interface' from source: play vars 30564 1726882812.21105: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882812.21315: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882812.21383: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882812.21429: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882812.21487: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882812.21550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882812.21597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882812.21632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882812.21674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882812.21724: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882812.22016: variable 'network_connections' from source: include params 30564 1726882812.22027: variable 'interface' from source: play vars 30564 1726882812.22109: variable 'interface' from source: play vars 30564 1726882812.22157: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882812.22355: variable '__network_wireless_connections_defined' from source: role '' defaults 
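Verbose streams like this one are easier to audit when reduced to just the `Evaluated conditional (...)` records, which is where tasks such as the wireless/team restart above get skipped or run. A small helper sketch for doing that — the log-file path is hypothetical, not part of this run:

```shell
# Extract every "Evaluated conditional (...): True/False" record from a
# saved -vvvv log. Uses plain grep with a BRE alternation; prints only
# the matching fragment of each line (-o), one conditional per line.
extract_conditionals() {
  grep -o "Evaluated conditional (.*): \(True\|False\)" "$1"
}
```

Running it over a captured log surfaces, for example, why the restart task above was skipped (`__network_wireless_connections_defined or __network_team_connections_defined` evaluated to False) without wading through plugin-loading noise.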
30564 1726882812.23002: variable 'network_connections' from source: include params 30564 1726882812.23012: variable 'interface' from source: play vars 30564 1726882812.23195: variable 'interface' from source: play vars 30564 1726882812.23220: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882812.23380: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882812.23881: variable 'network_connections' from source: include params 30564 1726882812.23891: variable 'interface' from source: play vars 30564 1726882812.24002: variable 'interface' from source: play vars 30564 1726882812.24078: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882812.24163: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882812.24182: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882812.24245: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882812.24482: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882812.25421: variable 'network_connections' from source: include params 30564 1726882812.25431: variable 'interface' from source: play vars 30564 1726882812.25504: variable 'interface' from source: play vars 30564 1726882812.25520: variable 'ansible_distribution' from source: facts 30564 1726882812.25529: variable '__network_rh_distros' from source: role '' defaults 30564 1726882812.25538: variable 'ansible_distribution_major_version' from source: facts 30564 1726882812.25589: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882812.25994: variable 'ansible_distribution' from source: facts 30564 1726882812.26004: variable '__network_rh_distros' from source: role '' defaults 30564 1726882812.26022: variable 'ansible_distribution_major_version' from 
source: facts 30564 1726882812.26050: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882812.26383: variable 'ansible_distribution' from source: facts 30564 1726882812.26387: variable '__network_rh_distros' from source: role '' defaults 30564 1726882812.26389: variable 'ansible_distribution_major_version' from source: facts 30564 1726882812.26422: variable 'network_provider' from source: set_fact 30564 1726882812.26444: variable 'omit' from source: magic vars 30564 1726882812.26499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882812.26528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882812.26545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882812.26565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882812.26575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882812.26615: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882812.26618: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882812.26621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882812.26731: Set connection var ansible_timeout to 10 30564 1726882812.26736: Set connection var ansible_pipelining to False 30564 1726882812.26739: Set connection var ansible_shell_type to sh 30564 1726882812.26745: Set connection var ansible_shell_executable to /bin/sh 30564 1726882812.26752: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882812.26755: Set connection var ansible_connection to ssh 30564 1726882812.26784: variable 'ansible_shell_executable' from 
source: unknown 30564 1726882812.26787: variable 'ansible_connection' from source: unknown 30564 1726882812.26789: variable 'ansible_module_compression' from source: unknown 30564 1726882812.26791: variable 'ansible_shell_type' from source: unknown 30564 1726882812.26797: variable 'ansible_shell_executable' from source: unknown 30564 1726882812.26800: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882812.26809: variable 'ansible_pipelining' from source: unknown 30564 1726882812.26812: variable 'ansible_timeout' from source: unknown 30564 1726882812.26816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882812.26919: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882812.26927: variable 'omit' from source: magic vars 30564 1726882812.26934: starting attempt loop 30564 1726882812.26936: running the handler 30564 1726882812.27020: variable 'ansible_facts' from source: unknown 30564 1726882812.27709: _low_level_execute_command(): starting 30564 1726882812.27714: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882812.28213: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.28233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.28347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.28419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882812.28435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882812.28574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882812.30223: stdout chunk (state=3): >>>/root <<< 30564 1726882812.30329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882812.30374: stderr chunk (state=3): >>><<< 30564 1726882812.30377: stdout chunk (state=3): >>><<< 30564 1726882812.30393: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882812.30402: _low_level_execute_command(): starting 30564 1726882812.30408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337 `" && echo ansible-tmp-1726882812.303932-31084-261316129893337="` echo /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337 `" ) && sleep 0' 30564 1726882812.30826: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.30832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.30862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882812.30872: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.30893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882812.30897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 
1726882812.30949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882812.30963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882812.31101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882812.32959: stdout chunk (state=3): >>>ansible-tmp-1726882812.303932-31084-261316129893337=/root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337 <<< 30564 1726882812.33073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882812.33119: stderr chunk (state=3): >>><<< 30564 1726882812.33121: stdout chunk (state=3): >>><<< 30564 1726882812.33172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882812.303932-31084-261316129893337=/root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882812.33177: variable 'ansible_module_compression' from source: unknown 30564 1726882812.33205: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 30564 1726882812.33209: ANSIBALLZ: Acquiring lock 30564 1726882812.33211: ANSIBALLZ: Lock acquired: 140506263950048 30564 1726882812.33213: ANSIBALLZ: Creating module 30564 1726882812.53574: ANSIBALLZ: Writing module into payload 30564 1726882812.53727: ANSIBALLZ: Writing module 30564 1726882812.53760: ANSIBALLZ: Renaming module 30564 1726882812.53766: ANSIBALLZ: Done creating module 30564 1726882812.53804: variable 'ansible_facts' from source: unknown 30564 1726882812.53989: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337/AnsiballZ_systemd.py 30564 1726882812.54131: Sending initial data 30564 1726882812.54135: Sent initial data (155 bytes) 30564 1726882812.55007: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.55014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.55041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882812.55045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.55047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.55098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882812.55109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882812.55224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882812.57098: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882812.57203: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882812.57309: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp6ym_31qj /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337/AnsiballZ_systemd.py <<< 30564 1726882812.57403: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882812.60174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882812.60427: stderr chunk (state=3): >>><<< 30564 1726882812.60430: stdout chunk (state=3): >>><<< 30564 1726882812.60432: done transferring module to remote 30564 1726882812.60434: _low_level_execute_command(): starting 30564 1726882812.60437: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337/ /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337/AnsiballZ_systemd.py && sleep 0' 30564 1726882812.61040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882812.61053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.61070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.61089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.61136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882812.61147: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882812.61160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.61182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882812.61192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882812.61207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882812.61219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.61231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.61246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.61258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882812.61273: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882812.61286: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.61371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882812.61391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882812.61410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882812.61551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882812.63324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882812.63399: stderr chunk (state=3): >>><<< 30564 1726882812.63409: stdout chunk (state=3): >>><<< 30564 1726882812.63471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882812.63475: _low_level_execute_command(): starting 30564 1726882812.63478: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337/AnsiballZ_systemd.py && sleep 0' 30564 1726882812.64076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882812.64089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.64102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.64118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.64157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882812.64179: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882812.64193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.64209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882812.64219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882812.64229: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882812.64240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.64255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.64281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.64293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882812.64303: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882812.64316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.64402: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 30564 1726882812.64422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882812.64437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882812.64575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882812.89498: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager 
org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882812.89531: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9183232", "MemoryAvailable": "infinity", "CPUUsageNSec": "2045755000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": 
"2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882812.89534: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": 
"Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882812.91084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882812.91131: stderr chunk (state=3): >>><<< 30564 1726882812.91135: stdout chunk (state=3): >>><<< 30564 1726882812.91171: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9183232", "MemoryAvailable": "infinity", "CPUUsageNSec": "2045755000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882812.91446: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882812.91450: _low_level_execute_command(): starting 30564 1726882812.91452: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882812.303932-31084-261316129893337/ > /dev/null 2>&1 && sleep 0' 30564 1726882812.92056: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882812.92076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.92090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.92113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.92152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882812.92165: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882812.92182: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.92198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882812.92215: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882812.92225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882812.92235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882812.92246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882812.92260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882812.92276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882812.92287: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882812.92299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882812.92384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882812.92405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882812.92425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882812.92556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882812.94455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882812.94459: stdout chunk (state=3): >>><<< 30564 1726882812.94461: stderr chunk (state=3): >>><<< 30564 1726882812.95071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882812.95075: handler run complete 30564 1726882812.95078: attempt loop complete, returning result 30564 1726882812.95080: _execute() done 30564 1726882812.95083: dumping result to json 30564 1726882812.95084: done dumping result, returning 30564 1726882812.95086: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000000218] 30564 1726882812.95088: sending task result for task 0e448fcc-3ce9-4216-acec-000000000218 30564 1726882812.95225: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000218 30564 1726882812.95230: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882812.95277: no more pending results, returning what we have 30564 1726882812.95280: results queue empty 30564 1726882812.95281: checking for any_errors_fatal 30564 1726882812.95287: done checking for 
any_errors_fatal 30564 1726882812.95288: checking for max_fail_percentage 30564 1726882812.95289: done checking for max_fail_percentage 30564 1726882812.95290: checking to see if all hosts have failed and the running result is not ok 30564 1726882812.95291: done checking to see if all hosts have failed 30564 1726882812.95291: getting the remaining hosts for this loop 30564 1726882812.95293: done getting the remaining hosts for this loop 30564 1726882812.95296: getting the next task for host managed_node2 30564 1726882812.95304: done getting next task for host managed_node2 30564 1726882812.95308: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882812.95313: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882812.95323: getting variables 30564 1726882812.95325: in VariableManager get_vars() 30564 1726882812.95354: Calling all_inventory to load vars for managed_node2 30564 1726882812.95356: Calling groups_inventory to load vars for managed_node2 30564 1726882812.95359: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882812.95371: Calling all_plugins_play to load vars for managed_node2 30564 1726882812.95374: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882812.95382: Calling groups_plugins_play to load vars for managed_node2 30564 1726882812.97230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882812.99026: done with get_vars() 30564 1726882812.99049: done getting variables 30564 1726882812.99111: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:40:12 -0400 (0:00:00.864) 0:00:11.572 ****** 30564 1726882812.99144: entering _queue_task() for managed_node2/service 30564 1726882812.99479: worker is 1 (out of 1 available) 30564 1726882812.99491: exiting _queue_task() for managed_node2/service 30564 1726882812.99504: done queuing things up, now waiting for results queue to drain 30564 1726882812.99505: waiting for pending results... 
30564 1726882812.99780: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882812.99912: in run() - task 0e448fcc-3ce9-4216-acec-000000000219 30564 1726882812.99932: variable 'ansible_search_path' from source: unknown 30564 1726882812.99942: variable 'ansible_search_path' from source: unknown 30564 1726882812.99986: calling self._execute() 30564 1726882813.00081: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882813.00092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882813.00106: variable 'omit' from source: magic vars 30564 1726882813.00465: variable 'ansible_distribution_major_version' from source: facts 30564 1726882813.00490: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882813.00615: variable 'network_provider' from source: set_fact 30564 1726882813.00625: Evaluated conditional (network_provider == "nm"): True 30564 1726882813.00725: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882813.00822: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882813.01009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882813.03412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882813.03489: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882813.03533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882813.03577: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882813.03607: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882813.03707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882813.03739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882813.03777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882813.03822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882813.03839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882813.03895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882813.03924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882813.03953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882813.04005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882813.04026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882813.04075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882813.04103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882813.04130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882813.04178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882813.04201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882813.04349: variable 'network_connections' from source: include params 30564 1726882813.04370: variable 'interface' from source: play vars 30564 1726882813.04449: variable 'interface' from source: play vars 30564 1726882813.04533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882813.04707: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882813.04751: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882813.04790: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882813.04823: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882813.04876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882813.04903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882813.04932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882813.04971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882813.05021: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882813.05298: variable 'network_connections' from source: include params 30564 1726882813.05310: variable 'interface' from source: play vars 30564 1726882813.05378: variable 'interface' from source: play vars 30564 1726882813.05422: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882813.05430: when evaluation is False, skipping this task 30564 1726882813.05436: _execute() done 30564 1726882813.05442: dumping result to json 30564 1726882813.05448: done dumping result, returning 30564 1726882813.05459: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000000219] 30564 
1726882813.05484: sending task result for task 0e448fcc-3ce9-4216-acec-000000000219 30564 1726882813.05596: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000219 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882813.05643: no more pending results, returning what we have 30564 1726882813.05647: results queue empty 30564 1726882813.05648: checking for any_errors_fatal 30564 1726882813.05676: done checking for any_errors_fatal 30564 1726882813.05677: checking for max_fail_percentage 30564 1726882813.05679: done checking for max_fail_percentage 30564 1726882813.05680: checking to see if all hosts have failed and the running result is not ok 30564 1726882813.05681: done checking to see if all hosts have failed 30564 1726882813.05682: getting the remaining hosts for this loop 30564 1726882813.05683: done getting the remaining hosts for this loop 30564 1726882813.05687: getting the next task for host managed_node2 30564 1726882813.05697: done getting next task for host managed_node2 30564 1726882813.05701: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882813.05705: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882813.05719: getting variables 30564 1726882813.05721: in VariableManager get_vars() 30564 1726882813.05756: Calling all_inventory to load vars for managed_node2 30564 1726882813.05758: Calling groups_inventory to load vars for managed_node2 30564 1726882813.05761: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882813.05775: Calling all_plugins_play to load vars for managed_node2 30564 1726882813.05778: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882813.05781: Calling groups_plugins_play to load vars for managed_node2 30564 1726882813.06985: WORKER PROCESS EXITING 30564 1726882813.07522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882813.09257: done with get_vars() 30564 1726882813.09292: done getting variables 30564 1726882813.09356: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:40:13 -0400 (0:00:00.102) 0:00:11.675 ****** 30564 1726882813.09397: entering _queue_task() for managed_node2/service 30564 1726882813.09736: worker is 1 (out of 1 available) 
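For readability: the "Enable and start wpa_supplicant" skip logged above (conditionals `network_provider == "nm"` evaluated True, `__network_wpa_supplicant_required` evaluated False) is consistent with a role task gated roughly like the following. This is a hypothetical reconstruction from the log, not the role's actual source; the real task lives at roles/network/tasks/main.yml:133.

```yaml
# Hypothetical sketch of the conditional service task whose skip is logged
# above; variable names are taken from the log output, not the role source.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```

Because `__network_wpa_supplicant_required` resolved to False, the task executor short-circuits with "when evaluation is False, skipping this task" and emits the `skip_reason` seen in the result.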
30564 1726882813.09749: exiting _queue_task() for managed_node2/service 30564 1726882813.09761: done queuing things up, now waiting for results queue to drain 30564 1726882813.09762: waiting for pending results... 30564 1726882813.10055: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882813.10208: in run() - task 0e448fcc-3ce9-4216-acec-00000000021a 30564 1726882813.10230: variable 'ansible_search_path' from source: unknown 30564 1726882813.10238: variable 'ansible_search_path' from source: unknown 30564 1726882813.10283: calling self._execute() 30564 1726882813.10387: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882813.10399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882813.10415: variable 'omit' from source: magic vars 30564 1726882813.10798: variable 'ansible_distribution_major_version' from source: facts 30564 1726882813.10818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882813.10939: variable 'network_provider' from source: set_fact 30564 1726882813.10950: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882813.10957: when evaluation is False, skipping this task 30564 1726882813.10965: _execute() done 30564 1726882813.10977: dumping result to json 30564 1726882813.10987: done dumping result, returning 30564 1726882813.10998: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-00000000021a] 30564 1726882813.11011: sending task result for task 0e448fcc-3ce9-4216-acec-00000000021a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882813.11160: no more pending results, returning what we have 30564 1726882813.11169: results queue empty 30564 1726882813.11171: 
checking for any_errors_fatal 30564 1726882813.11180: done checking for any_errors_fatal 30564 1726882813.11181: checking for max_fail_percentage 30564 1726882813.11183: done checking for max_fail_percentage 30564 1726882813.11184: checking to see if all hosts have failed and the running result is not ok 30564 1726882813.11184: done checking to see if all hosts have failed 30564 1726882813.11185: getting the remaining hosts for this loop 30564 1726882813.11187: done getting the remaining hosts for this loop 30564 1726882813.11191: getting the next task for host managed_node2 30564 1726882813.11200: done getting next task for host managed_node2 30564 1726882813.11205: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882813.11210: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882813.11227: getting variables 30564 1726882813.11229: in VariableManager get_vars() 30564 1726882813.11271: Calling all_inventory to load vars for managed_node2 30564 1726882813.11274: Calling groups_inventory to load vars for managed_node2 30564 1726882813.11277: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882813.11290: Calling all_plugins_play to load vars for managed_node2 30564 1726882813.11293: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882813.11296: Calling groups_plugins_play to load vars for managed_node2 30564 1726882813.12887: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000021a 30564 1726882813.12891: WORKER PROCESS EXITING 30564 1726882813.13236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882813.14957: done with get_vars() 30564 1726882813.14987: done getting variables 30564 1726882813.15047: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:40:13 -0400 (0:00:00.056) 0:00:11.732 ****** 30564 1726882813.15086: entering _queue_task() for managed_node2/copy 30564 1726882813.15400: worker is 1 (out of 1 available) 30564 1726882813.15411: exiting _queue_task() for managed_node2/copy 30564 1726882813.15425: done queuing things up, now waiting for results queue to drain 30564 1726882813.15426: waiting for pending results... 
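The "Ensure initscripts network file dependency is present" task queued above uses the `copy` action (loaded in the log) and, as the subsequent skip shows, is gated on the initscripts provider. A hypothetical sketch, assuming a shape like the following; the condition and module come from the log, while the destination path and content are illustrative placeholders only:

```yaml
# Hypothetical sketch: copy task gated on the initscripts provider. It is
# skipped on this run because network_provider is "nm", not "initscripts".
# dest and content are placeholders, not taken from the role.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network
    content: "# placeholder; real file contents are defined by the role"
  when: network_provider == "initscripts"
```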
30564 1726882813.15711: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882813.15860: in run() - task 0e448fcc-3ce9-4216-acec-00000000021b 30564 1726882813.15889: variable 'ansible_search_path' from source: unknown 30564 1726882813.15898: variable 'ansible_search_path' from source: unknown 30564 1726882813.15939: calling self._execute() 30564 1726882813.16043: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882813.16056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882813.16076: variable 'omit' from source: magic vars 30564 1726882813.16451: variable 'ansible_distribution_major_version' from source: facts 30564 1726882813.16473: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882813.16596: variable 'network_provider' from source: set_fact 30564 1726882813.16607: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882813.16614: when evaluation is False, skipping this task 30564 1726882813.16619: _execute() done 30564 1726882813.16624: dumping result to json 30564 1726882813.16632: done dumping result, returning 30564 1726882813.16642: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-00000000021b] 30564 1726882813.16651: sending task result for task 0e448fcc-3ce9-4216-acec-00000000021b skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882813.16802: no more pending results, returning what we have 30564 1726882813.16807: results queue empty 30564 1726882813.16808: checking for any_errors_fatal 30564 1726882813.16816: done checking for any_errors_fatal 30564 1726882813.16816: checking for max_fail_percentage 30564 
1726882813.16818: done checking for max_fail_percentage 30564 1726882813.16819: checking to see if all hosts have failed and the running result is not ok 30564 1726882813.16820: done checking to see if all hosts have failed 30564 1726882813.16820: getting the remaining hosts for this loop 30564 1726882813.16822: done getting the remaining hosts for this loop 30564 1726882813.16826: getting the next task for host managed_node2 30564 1726882813.16834: done getting next task for host managed_node2 30564 1726882813.16837: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882813.16842: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882813.16856: getting variables 30564 1726882813.16857: in VariableManager get_vars() 30564 1726882813.16894: Calling all_inventory to load vars for managed_node2 30564 1726882813.16897: Calling groups_inventory to load vars for managed_node2 30564 1726882813.16899: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882813.16912: Calling all_plugins_play to load vars for managed_node2 30564 1726882813.16915: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882813.16918: Calling groups_plugins_play to load vars for managed_node2 30564 1726882813.17985: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000021b 30564 1726882813.17989: WORKER PROCESS EXITING 30564 1726882813.18602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882813.20448: done with get_vars() 30564 1726882813.20477: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:40:13 -0400 (0:00:00.054) 0:00:11.786 ****** 30564 1726882813.20565: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882813.20569: Creating lock for fedora.linux_system_roles.network_connections 30564 1726882813.20879: worker is 1 (out of 1 available) 30564 1726882813.20892: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882813.20904: done queuing things up, now waiting for results queue to drain 30564 1726882813.20906: waiting for pending results... 
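Before the "Configure networking connection profiles" task runs, the log resolves `network_connections` from include params and `interface` from play vars (twice, once per templated reference). A hypothetical sketch of the variable shape that would produce those lookups; the profile fields and the interface value below are illustrative assumptions, since the run's actual values are hidden or not shown:

```yaml
# Hypothetical play vars feeding the network_connections task. The variable
# names match the log; the values are placeholders for illustration.
interface: testnic1            # placeholder; the real value is not shown in the log
network_connections:
  - name: "{{ interface }}"    # each reference triggers a 'from source: play vars' line
    interface_name: "{{ interface }}"
    state: up
```

The doubled `variable 'interface' from source: play vars` entries in the log correspond to the two Jinja2 references in a profile like this being resolved during templating.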
30564 1726882813.21196: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882813.21337: in run() - task 0e448fcc-3ce9-4216-acec-00000000021c 30564 1726882813.21362: variable 'ansible_search_path' from source: unknown 30564 1726882813.21377: variable 'ansible_search_path' from source: unknown 30564 1726882813.21417: calling self._execute() 30564 1726882813.21517: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882813.21530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882813.21545: variable 'omit' from source: magic vars 30564 1726882813.21934: variable 'ansible_distribution_major_version' from source: facts 30564 1726882813.21953: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882813.21970: variable 'omit' from source: magic vars 30564 1726882813.22043: variable 'omit' from source: magic vars 30564 1726882813.22213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882813.24503: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882813.24570: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882813.24614: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882813.24650: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882813.24687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882813.24773: variable 'network_provider' from source: set_fact 30564 1726882813.24906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882813.24942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882813.24978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882813.25025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882813.25047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882813.25127: variable 'omit' from source: magic vars 30564 1726882813.25244: variable 'omit' from source: magic vars 30564 1726882813.25379: variable 'network_connections' from source: include params 30564 1726882813.25393: variable 'interface' from source: play vars 30564 1726882813.25459: variable 'interface' from source: play vars 30564 1726882813.25632: variable 'omit' from source: magic vars 30564 1726882813.25644: variable '__lsr_ansible_managed' from source: task vars 30564 1726882813.25713: variable '__lsr_ansible_managed' from source: task vars 30564 1726882813.25894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882813.26114: Loaded config def from plugin (lookup/template) 30564 1726882813.26127: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882813.26157: File lookup term: get_ansible_managed.j2 30564 1726882813.26165: variable 
'ansible_search_path' from source: unknown 30564 1726882813.26177: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882813.26194: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882813.26214: variable 'ansible_search_path' from source: unknown 30564 1726882813.32718: variable 'ansible_managed' from source: unknown 30564 1726882813.32862: variable 'omit' from source: magic vars 30564 1726882813.32900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882813.32932: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882813.32957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882813.32984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882813.32998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882813.33028: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882813.33035: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882813.33042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882813.33144: Set connection var ansible_timeout to 10 30564 1726882813.33154: Set connection var ansible_pipelining to False 30564 1726882813.33164: Set connection var ansible_shell_type to sh 30564 1726882813.33178: Set connection var ansible_shell_executable to /bin/sh 30564 1726882813.33190: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882813.33195: Set connection var ansible_connection to ssh 30564 1726882813.33224: variable 'ansible_shell_executable' from source: unknown 30564 1726882813.33231: variable 'ansible_connection' from source: unknown 30564 1726882813.33237: variable 'ansible_module_compression' from source: unknown 30564 1726882813.33243: variable 'ansible_shell_type' from source: unknown 30564 1726882813.33248: variable 'ansible_shell_executable' from source: unknown 30564 1726882813.33254: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882813.33261: variable 'ansible_pipelining' from source: unknown 30564 1726882813.33274: variable 'ansible_timeout' from source: unknown 30564 1726882813.33281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882813.33411: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882813.33434: variable 'omit' from 
source: magic vars 30564 1726882813.33444: starting attempt loop 30564 1726882813.33450: running the handler 30564 1726882813.33470: _low_level_execute_command(): starting 30564 1726882813.33483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882813.34229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882813.34245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.34261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.34285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.34329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.34340: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882813.34354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.34378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882813.34390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882813.34400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882813.34412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.34424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.34438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.34448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.34457: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882813.34477: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.34552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882813.34573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882813.34590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882813.34733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882813.36396: stdout chunk (state=3): >>>/root <<< 30564 1726882813.36500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882813.36580: stderr chunk (state=3): >>><<< 30564 1726882813.36583: stdout chunk (state=3): >>><<< 30564 1726882813.36698: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882813.36701: 
_low_level_execute_command(): starting 30564 1726882813.36704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928 `" && echo ansible-tmp-1726882813.366033-31127-6925921261928="` echo /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928 `" ) && sleep 0' 30564 1726882813.37337: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882813.37357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.37377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.37396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.37437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.37450: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882813.37473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.37493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882813.37505: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882813.37517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882813.37529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.37543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.37559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.37582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882813.37596: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882813.37610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.37689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882813.37715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882813.37731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882813.37861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882813.39714: stdout chunk (state=3): >>>ansible-tmp-1726882813.366033-31127-6925921261928=/root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928 <<< 30564 1726882813.39824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882813.39870: stderr chunk (state=3): >>><<< 30564 1726882813.39877: stdout chunk (state=3): >>><<< 30564 1726882813.39892: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882813.366033-31127-6925921261928=/root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882813.39930: variable 'ansible_module_compression' from source: unknown 30564 1726882813.39967: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 30564 1726882813.39970: ANSIBALLZ: Acquiring lock 30564 1726882813.39973: ANSIBALLZ: Lock acquired: 140506263000160 30564 1726882813.39979: ANSIBALLZ: Creating module 30564 1726882813.61580: ANSIBALLZ: Writing module into payload 30564 1726882813.62047: ANSIBALLZ: Writing module 30564 1726882813.62079: ANSIBALLZ: Renaming module 30564 1726882813.62085: ANSIBALLZ: Done creating module 30564 1726882813.62109: variable 'ansible_facts' from source: unknown 30564 1726882813.62201: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928/AnsiballZ_network_connections.py 30564 1726882813.62335: Sending initial data 30564 1726882813.62338: Sent initial data (165 bytes) 30564 1726882813.63293: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882813.63302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.63313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.63327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.63368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.63378: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882813.63388: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.63402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882813.63409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882813.63416: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882813.63424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.63433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.63445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.63451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.63458: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882813.63480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.63550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882813.63572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882813.63586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882813.63721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882813.65546: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882813.65645: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882813.65745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpq7y8ca4d /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928/AnsiballZ_network_connections.py <<< 30564 1726882813.65841: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882813.67682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882813.67753: stderr chunk (state=3): >>><<< 30564 1726882813.67756: stdout chunk (state=3): >>><<< 30564 1726882813.67781: done transferring module to remote 30564 1726882813.67791: _low_level_execute_command(): starting 30564 1726882813.67796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928/ /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928/AnsiballZ_network_connections.py && sleep 0' 30564 1726882813.68498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882813.68665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.68684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.68698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.68734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.68741: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882813.68751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30564 1726882813.68766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882813.68779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882813.68786: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882813.68793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.68802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.68813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.68820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.68826: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882813.68835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.68909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882813.68927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882813.68938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882813.69060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882813.70885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882813.70916: stderr chunk (state=3): >>><<< 30564 1726882813.70919: stdout chunk (state=3): >>><<< 30564 1726882813.70938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882813.70945: _low_level_execute_command(): starting 30564 1726882813.70947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928/AnsiballZ_network_connections.py && sleep 0' 30564 1726882813.72185: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882813.72297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.72307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.72340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.72355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.72363: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882813.72378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.72392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 30564 1726882813.72401: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882813.72407: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882813.72414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882813.72423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882813.72435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882813.72443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882813.72450: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882813.72459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882813.72534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882813.72552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882813.72567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882813.72705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882813.97909: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, 
"auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882814.00663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882814.00723: stderr chunk (state=3): >>><<< 30564 1726882814.00726: stdout chunk (state=3): >>><<< 30564 1726882814.00742: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882814.00779: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882814.00785: _low_level_execute_command(): starting 30564 1726882814.00790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882813.366033-31127-6925921261928/ > /dev/null 2>&1 && sleep 0' 30564 1726882814.01485: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882814.01494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.01506: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882814.01523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.01566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882814.01579: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882814.01589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.01602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882814.01609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882814.01617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882814.01629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.01638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882814.01649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.01661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882814.01670: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882814.01683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.01760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882814.01784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882814.01790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882814.01935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882814.03713: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30564 1726882814.03760: stderr chunk (state=3): >>><<< 30564 1726882814.03765: stdout chunk (state=3): >>><<< 30564 1726882814.03782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882814.03788: handler run complete 30564 1726882814.03809: attempt loop complete, returning result 30564 1726882814.03815: _execute() done 30564 1726882814.03818: dumping result to json 30564 1726882814.03822: done dumping result, returning 30564 1726882814.03830: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-00000000021c] 30564 1726882814.03836: sending task result for task 0e448fcc-3ce9-4216-acec-00000000021c 30564 1726882814.03955: done sending task result for task 
0e448fcc-3ce9-4216-acec-00000000021c 30564 1726882814.03957: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c 30564 1726882814.04055: no more pending results, returning what we have 30564 1726882814.04058: results queue empty 30564 1726882814.04059: checking for any_errors_fatal 30564 1726882814.04070: done checking for any_errors_fatal 30564 1726882814.04071: checking for max_fail_percentage 30564 1726882814.04072: done checking for max_fail_percentage 30564 1726882814.04073: checking to see if all hosts have failed and the running result is not ok 30564 1726882814.04074: done checking to see if all hosts have failed 30564 1726882814.04075: getting the remaining hosts for this loop 30564 1726882814.04076: done getting the remaining hosts for this loop 30564 1726882814.04080: getting the next task for host managed_node2 30564 1726882814.04088: done getting next task for host managed_node2 30564 1726882814.04091: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882814.04096: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882814.04108: getting variables 30564 1726882814.04110: in VariableManager get_vars() 30564 1726882814.04144: Calling all_inventory to load vars for managed_node2 30564 1726882814.04146: Calling groups_inventory to load vars for managed_node2 30564 1726882814.04148: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882814.04158: Calling all_plugins_play to load vars for managed_node2 30564 1726882814.04160: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882814.04163: Calling groups_plugins_play to load vars for managed_node2 30564 1726882814.05700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882814.07459: done with get_vars() 30564 1726882814.07486: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:40:14 -0400 (0:00:00.870) 0:00:12.657 ****** 30564 1726882814.07571: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882814.07573: Creating lock for 
fedora.linux_system_roles.network_state 30564 1726882814.07854: worker is 1 (out of 1 available) 30564 1726882814.07871: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882814.07885: done queuing things up, now waiting for results queue to drain 30564 1726882814.07886: waiting for pending results... 30564 1726882814.08158: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882814.08303: in run() - task 0e448fcc-3ce9-4216-acec-00000000021d 30564 1726882814.08325: variable 'ansible_search_path' from source: unknown 30564 1726882814.08336: variable 'ansible_search_path' from source: unknown 30564 1726882814.08378: calling self._execute() 30564 1726882814.08473: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.08486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.08500: variable 'omit' from source: magic vars 30564 1726882814.08860: variable 'ansible_distribution_major_version' from source: facts 30564 1726882814.08885: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882814.09011: variable 'network_state' from source: role '' defaults 30564 1726882814.09026: Evaluated conditional (network_state != {}): False 30564 1726882814.09033: when evaluation is False, skipping this task 30564 1726882814.09039: _execute() done 30564 1726882814.09045: dumping result to json 30564 1726882814.09051: done dumping result, returning 30564 1726882814.09060: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-00000000021d] 30564 1726882814.09074: sending task result for task 0e448fcc-3ce9-4216-acec-00000000021d 30564 1726882814.09183: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000021d 30564 1726882814.09192: WORKER PROCESS EXITING skipping: [managed_node2] => 
{ "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882814.09242: no more pending results, returning what we have 30564 1726882814.09246: results queue empty 30564 1726882814.09247: checking for any_errors_fatal 30564 1726882814.09259: done checking for any_errors_fatal 30564 1726882814.09260: checking for max_fail_percentage 30564 1726882814.09262: done checking for max_fail_percentage 30564 1726882814.09265: checking to see if all hosts have failed and the running result is not ok 30564 1726882814.09266: done checking to see if all hosts have failed 30564 1726882814.09269: getting the remaining hosts for this loop 30564 1726882814.09271: done getting the remaining hosts for this loop 30564 1726882814.09275: getting the next task for host managed_node2 30564 1726882814.09283: done getting next task for host managed_node2 30564 1726882814.09288: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882814.09295: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882814.09312: getting variables 30564 1726882814.09314: in VariableManager get_vars() 30564 1726882814.09348: Calling all_inventory to load vars for managed_node2 30564 1726882814.09351: Calling groups_inventory to load vars for managed_node2 30564 1726882814.09354: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882814.09371: Calling all_plugins_play to load vars for managed_node2 30564 1726882814.09375: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882814.09379: Calling groups_plugins_play to load vars for managed_node2 30564 1726882814.11108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882814.12844: done with get_vars() 30564 1726882814.12871: done getting variables 30564 1726882814.12929: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:40:14 -0400 (0:00:00.053) 0:00:12.710 ****** 30564 1726882814.12961: entering _queue_task() for managed_node2/debug 30564 1726882814.13252: worker is 1 (out of 1 available) 30564 1726882814.13265: exiting _queue_task() for managed_node2/debug 30564 1726882814.13279: done queuing things up, now waiting for results queue to drain 30564 1726882814.13280: waiting for pending results... 
30564 1726882814.13557: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882814.13703: in run() - task 0e448fcc-3ce9-4216-acec-00000000021e 30564 1726882814.13726: variable 'ansible_search_path' from source: unknown 30564 1726882814.13734: variable 'ansible_search_path' from source: unknown 30564 1726882814.13777: calling self._execute() 30564 1726882814.13872: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.13885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.13900: variable 'omit' from source: magic vars 30564 1726882814.14274: variable 'ansible_distribution_major_version' from source: facts 30564 1726882814.14293: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882814.14305: variable 'omit' from source: magic vars 30564 1726882814.14374: variable 'omit' from source: magic vars 30564 1726882814.14410: variable 'omit' from source: magic vars 30564 1726882814.14453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882814.14500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882814.14522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882814.14543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.14557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.14599: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882814.14608: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.14616: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882814.14727: Set connection var ansible_timeout to 10 30564 1726882814.14737: Set connection var ansible_pipelining to False 30564 1726882814.14743: Set connection var ansible_shell_type to sh 30564 1726882814.14752: Set connection var ansible_shell_executable to /bin/sh 30564 1726882814.14762: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882814.14774: Set connection var ansible_connection to ssh 30564 1726882814.14802: variable 'ansible_shell_executable' from source: unknown 30564 1726882814.14812: variable 'ansible_connection' from source: unknown 30564 1726882814.14819: variable 'ansible_module_compression' from source: unknown 30564 1726882814.14824: variable 'ansible_shell_type' from source: unknown 30564 1726882814.14830: variable 'ansible_shell_executable' from source: unknown 30564 1726882814.14836: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.14842: variable 'ansible_pipelining' from source: unknown 30564 1726882814.14848: variable 'ansible_timeout' from source: unknown 30564 1726882814.14854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.14995: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882814.15013: variable 'omit' from source: magic vars 30564 1726882814.15027: starting attempt loop 30564 1726882814.15034: running the handler 30564 1726882814.15169: variable '__network_connections_result' from source: set_fact 30564 1726882814.15221: handler run complete 30564 1726882814.15246: attempt loop complete, returning result 30564 1726882814.15253: _execute() done 30564 1726882814.15259: dumping result to json 30564 1726882814.15269: 
done dumping result, returning 30564 1726882814.15281: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000021e] 30564 1726882814.15290: sending task result for task 0e448fcc-3ce9-4216-acec-00000000021e ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c" ] } 30564 1726882814.15450: no more pending results, returning what we have 30564 1726882814.15453: results queue empty 30564 1726882814.15455: checking for any_errors_fatal 30564 1726882814.15465: done checking for any_errors_fatal 30564 1726882814.15466: checking for max_fail_percentage 30564 1726882814.15469: done checking for max_fail_percentage 30564 1726882814.15470: checking to see if all hosts have failed and the running result is not ok 30564 1726882814.15471: done checking to see if all hosts have failed 30564 1726882814.15472: getting the remaining hosts for this loop 30564 1726882814.15474: done getting the remaining hosts for this loop 30564 1726882814.15479: getting the next task for host managed_node2 30564 1726882814.15487: done getting next task for host managed_node2 30564 1726882814.15492: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882814.15497: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882814.15508: getting variables 30564 1726882814.15509: in VariableManager get_vars() 30564 1726882814.15542: Calling all_inventory to load vars for managed_node2 30564 1726882814.15545: Calling groups_inventory to load vars for managed_node2 30564 1726882814.15547: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882814.15557: Calling all_plugins_play to load vars for managed_node2 30564 1726882814.15561: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882814.15566: Calling groups_plugins_play to load vars for managed_node2 30564 1726882814.16584: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000021e 30564 1726882814.16587: WORKER PROCESS EXITING 30564 1726882814.17250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882814.19114: done with get_vars() 30564 1726882814.19135: done getting variables 30564 1726882814.19196: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:40:14 -0400 (0:00:00.062) 0:00:12.773 ****** 30564 1726882814.19232: entering _queue_task() for managed_node2/debug 30564 1726882814.19513: worker is 1 (out of 1 available) 30564 1726882814.19525: exiting _queue_task() for managed_node2/debug 30564 1726882814.19537: done queuing things up, now waiting for results queue to drain 30564 1726882814.19538: waiting for pending results... 30564 1726882814.19817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882814.19972: in run() - task 0e448fcc-3ce9-4216-acec-00000000021f 30564 1726882814.19995: variable 'ansible_search_path' from source: unknown 30564 1726882814.20001: variable 'ansible_search_path' from source: unknown 30564 1726882814.20043: calling self._execute() 30564 1726882814.20140: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.20150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.20166: variable 'omit' from source: magic vars 30564 1726882814.20543: variable 'ansible_distribution_major_version' from source: facts 30564 1726882814.20562: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882814.20578: variable 'omit' from source: magic vars 30564 1726882814.20643: variable 'omit' from source: magic vars 30564 1726882814.20686: variable 'omit' from source: magic vars 30564 1726882814.20727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882814.20769: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882814.20797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882814.20818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.20833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.20869: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882814.20877: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.20885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.20991: Set connection var ansible_timeout to 10 30564 1726882814.21002: Set connection var ansible_pipelining to False 30564 1726882814.21012: Set connection var ansible_shell_type to sh 30564 1726882814.21022: Set connection var ansible_shell_executable to /bin/sh 30564 1726882814.21032: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882814.21039: Set connection var ansible_connection to ssh 30564 1726882814.21073: variable 'ansible_shell_executable' from source: unknown 30564 1726882814.21082: variable 'ansible_connection' from source: unknown 30564 1726882814.21088: variable 'ansible_module_compression' from source: unknown 30564 1726882814.21094: variable 'ansible_shell_type' from source: unknown 30564 1726882814.21100: variable 'ansible_shell_executable' from source: unknown 30564 1726882814.21106: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.21113: variable 'ansible_pipelining' from source: unknown 30564 1726882814.21122: variable 'ansible_timeout' from source: unknown 30564 1726882814.21129: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882814.21266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882814.21287: variable 'omit' from source: magic vars 30564 1726882814.21296: starting attempt loop 30564 1726882814.21302: running the handler 30564 1726882814.21355: variable '__network_connections_result' from source: set_fact 30564 1726882814.21434: variable '__network_connections_result' from source: set_fact 30564 1726882814.21560: handler run complete 30564 1726882814.21592: attempt loop complete, returning result 30564 1726882814.21600: _execute() done 30564 1726882814.21608: dumping result to json 30564 1726882814.21616: done dumping result, returning 30564 1726882814.21626: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000021f] 30564 1726882814.21635: sending task result for task 0e448fcc-3ce9-4216-acec-00000000021f ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c" ] } } 30564 1726882814.21826: no more pending results, returning what we have 30564 
1726882814.21829: results queue empty 30564 1726882814.21831: checking for any_errors_fatal 30564 1726882814.21840: done checking for any_errors_fatal 30564 1726882814.21841: checking for max_fail_percentage 30564 1726882814.21843: done checking for max_fail_percentage 30564 1726882814.21844: checking to see if all hosts have failed and the running result is not ok 30564 1726882814.21844: done checking to see if all hosts have failed 30564 1726882814.21845: getting the remaining hosts for this loop 30564 1726882814.21847: done getting the remaining hosts for this loop 30564 1726882814.21851: getting the next task for host managed_node2 30564 1726882814.21860: done getting next task for host managed_node2 30564 1726882814.21869: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882814.21874: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882814.21886: getting variables 30564 1726882814.21888: in VariableManager get_vars() 30564 1726882814.21920: Calling all_inventory to load vars for managed_node2 30564 1726882814.21922: Calling groups_inventory to load vars for managed_node2 30564 1726882814.21930: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882814.21941: Calling all_plugins_play to load vars for managed_node2 30564 1726882814.21944: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882814.21947: Calling groups_plugins_play to load vars for managed_node2 30564 1726882814.22929: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000021f 30564 1726882814.22933: WORKER PROCESS EXITING 30564 1726882814.23657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882814.25374: done with get_vars() 30564 1726882814.25398: done getting variables 30564 1726882814.25458: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:40:14 -0400 (0:00:00.062) 0:00:12.836 ****** 30564 1726882814.25496: entering _queue_task() for managed_node2/debug 30564 1726882814.25756: worker is 1 (out of 1 available) 30564 1726882814.25773: exiting _queue_task() for managed_node2/debug 30564 1726882814.25785: done queuing things up, now waiting for results queue to drain 30564 1726882814.25786: waiting for pending results... 
30564 1726882814.25965: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882814.26055: in run() - task 0e448fcc-3ce9-4216-acec-000000000220 30564 1726882814.26067: variable 'ansible_search_path' from source: unknown 30564 1726882814.26072: variable 'ansible_search_path' from source: unknown 30564 1726882814.26103: calling self._execute() 30564 1726882814.26177: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.26181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.26191: variable 'omit' from source: magic vars 30564 1726882814.26456: variable 'ansible_distribution_major_version' from source: facts 30564 1726882814.26469: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882814.26550: variable 'network_state' from source: role '' defaults 30564 1726882814.26565: Evaluated conditional (network_state != {}): False 30564 1726882814.26571: when evaluation is False, skipping this task 30564 1726882814.26574: _execute() done 30564 1726882814.26577: dumping result to json 30564 1726882814.26580: done dumping result, returning 30564 1726882814.26583: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000000220] 30564 1726882814.26589: sending task result for task 0e448fcc-3ce9-4216-acec-000000000220 30564 1726882814.26682: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000220 30564 1726882814.26685: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882814.26727: no more pending results, returning what we have 30564 1726882814.26732: results queue empty 30564 1726882814.26733: checking for any_errors_fatal 30564 1726882814.26743: done checking for any_errors_fatal 30564 1726882814.26744: checking for 
max_fail_percentage 30564 1726882814.26745: done checking for max_fail_percentage 30564 1726882814.26746: checking to see if all hosts have failed and the running result is not ok 30564 1726882814.26747: done checking to see if all hosts have failed 30564 1726882814.26748: getting the remaining hosts for this loop 30564 1726882814.26750: done getting the remaining hosts for this loop 30564 1726882814.26753: getting the next task for host managed_node2 30564 1726882814.26760: done getting next task for host managed_node2 30564 1726882814.26769: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882814.26773: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882814.26786: getting variables 30564 1726882814.26788: in VariableManager get_vars() 30564 1726882814.26814: Calling all_inventory to load vars for managed_node2 30564 1726882814.26817: Calling groups_inventory to load vars for managed_node2 30564 1726882814.26819: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882814.26828: Calling all_plugins_play to load vars for managed_node2 30564 1726882814.26830: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882814.26833: Calling groups_plugins_play to load vars for managed_node2 30564 1726882814.27926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882814.29219: done with get_vars() 30564 1726882814.29234: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:40:14 -0400 (0:00:00.038) 0:00:12.874 ****** 30564 1726882814.29303: entering _queue_task() for managed_node2/ping 30564 1726882814.29305: Creating lock for ping 30564 1726882814.29512: worker is 1 (out of 1 available) 30564 1726882814.29525: exiting _queue_task() for managed_node2/ping 30564 1726882814.29537: done queuing things up, now waiting for results queue to drain 30564 1726882814.29538: waiting for pending results... 
30564 1726882814.29719: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882814.29817: in run() - task 0e448fcc-3ce9-4216-acec-000000000221 30564 1726882814.29827: variable 'ansible_search_path' from source: unknown 30564 1726882814.29830: variable 'ansible_search_path' from source: unknown 30564 1726882814.29860: calling self._execute() 30564 1726882814.29929: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.29933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.29942: variable 'omit' from source: magic vars 30564 1726882814.30205: variable 'ansible_distribution_major_version' from source: facts 30564 1726882814.30217: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882814.30222: variable 'omit' from source: magic vars 30564 1726882814.30261: variable 'omit' from source: magic vars 30564 1726882814.30292: variable 'omit' from source: magic vars 30564 1726882814.30321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882814.30346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882814.30362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882814.30380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.30389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.30413: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882814.30417: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.30419: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882814.30489: Set connection var ansible_timeout to 10 30564 1726882814.30492: Set connection var ansible_pipelining to False 30564 1726882814.30495: Set connection var ansible_shell_type to sh 30564 1726882814.30502: Set connection var ansible_shell_executable to /bin/sh 30564 1726882814.30509: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882814.30512: Set connection var ansible_connection to ssh 30564 1726882814.30531: variable 'ansible_shell_executable' from source: unknown 30564 1726882814.30534: variable 'ansible_connection' from source: unknown 30564 1726882814.30537: variable 'ansible_module_compression' from source: unknown 30564 1726882814.30539: variable 'ansible_shell_type' from source: unknown 30564 1726882814.30542: variable 'ansible_shell_executable' from source: unknown 30564 1726882814.30544: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.30546: variable 'ansible_pipelining' from source: unknown 30564 1726882814.30549: variable 'ansible_timeout' from source: unknown 30564 1726882814.30551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.30713: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882814.30721: variable 'omit' from source: magic vars 30564 1726882814.30732: starting attempt loop 30564 1726882814.30735: running the handler 30564 1726882814.30753: _low_level_execute_command(): starting 30564 1726882814.30782: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882814.31559: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882814.31579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882814.31600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882814.31620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.31661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882814.31679: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882814.31692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.31716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882814.31728: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882814.31739: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882814.31751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.31767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882814.31788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.31802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882814.31814: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882814.31836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.31919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882814.31953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882814.31973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882814.32116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882814.33775: stdout chunk (state=3): >>>/root <<< 30564 1726882814.33877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882814.33932: stderr chunk (state=3): >>><<< 30564 1726882814.33934: stdout chunk (state=3): >>><<< 30564 1726882814.34014: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882814.34017: _low_level_execute_command(): starting 30564 1726882814.34021: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971 `" && echo ansible-tmp-1726882814.3394887-31168-125307328327971="` echo /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971 `" ) && sleep 0' 30564 1726882814.34450: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882814.34460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.34498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882814.34511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.34541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882814.34549: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882814.34558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.34581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882814.34731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.34739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882814.34743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882814.34745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882814.34834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882814.36712: stdout chunk (state=3): >>>ansible-tmp-1726882814.3394887-31168-125307328327971=/root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971 <<< 30564 1726882814.36825: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 30564 1726882814.36898: stderr chunk (state=3): >>><<< 30564 1726882814.36909: stdout chunk (state=3): >>><<< 30564 1726882814.37173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882814.3394887-31168-125307328327971=/root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882814.37176: variable 'ansible_module_compression' from source: unknown 30564 1726882814.37179: ANSIBALLZ: Using lock for ping 30564 1726882814.37181: ANSIBALLZ: Acquiring lock 30564 1726882814.37183: ANSIBALLZ: Lock acquired: 140506261689680 30564 1726882814.37185: ANSIBALLZ: Creating module 30564 1726882814.45642: ANSIBALLZ: Writing module into payload 30564 1726882814.45688: ANSIBALLZ: Writing module 30564 1726882814.45704: ANSIBALLZ: Renaming module 30564 1726882814.45709: ANSIBALLZ: Done creating 
module 30564 1726882814.45725: variable 'ansible_facts' from source: unknown 30564 1726882814.45768: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971/AnsiballZ_ping.py 30564 1726882814.45881: Sending initial data 30564 1726882814.45885: Sent initial data (153 bytes) 30564 1726882814.46569: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.46580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.46618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882814.46622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882814.46624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.46668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882814.46681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882814.46802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882814.48682: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882814.48781: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882814.48881: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpfzd321ae /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971/AnsiballZ_ping.py <<< 30564 1726882814.48985: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882814.49999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882814.50108: stderr chunk (state=3): >>><<< 30564 1726882814.50111: stdout chunk (state=3): >>><<< 30564 1726882814.50129: done transferring module to remote 30564 1726882814.50137: _low_level_execute_command(): starting 30564 1726882814.50142: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971/ /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971/AnsiballZ_ping.py && sleep 0' 30564 1726882814.50595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.50611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.50638: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.50654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.50708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882814.50724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882814.50833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882814.52682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882814.52730: stderr chunk (state=3): >>><<< 30564 1726882814.52733: stdout chunk (state=3): >>><<< 30564 1726882814.52749: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882814.52752: _low_level_execute_command(): starting 30564 1726882814.52758: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971/AnsiballZ_ping.py && sleep 0' 30564 1726882814.53208: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.53213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.53245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.53257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882814.53272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.53319: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 30564 1726882814.53326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882814.53447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882814.66332: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882814.67284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882814.67341: stderr chunk (state=3): >>><<< 30564 1726882814.67344: stdout chunk (state=3): >>><<< 30564 1726882814.67362: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
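The stdout chunk above is the ping module's entire contract with the controller: a single JSON object printed on stdout by the transferred `AnsiballZ_ping.py`. A rough, simplified sketch of that behavior (this is not the actual module source, which wraps `AnsibleModule`, handles the `data: crash` case, and is shipped inside a self-extracting AnsiballZ zip payload):

```python
import json

def ping_module(raw_args: str) -> str:
    """Minimal sketch of what AnsiballZ_ping.py does on the target:
    parse the JSON module args, echo back 'data' (default 'pong'),
    and return a single JSON result object for stdout."""
    args = json.loads(raw_args) if raw_args else {}
    data = args.get("data", "pong")
    # Mirror the result shape seen in the stdout chunk in the log above.
    return json.dumps({"ping": data,
                       "invocation": {"module_args": {"data": data}}})

# With no explicit args the module answers "pong", matching the log:
print(ping_module("{}"))
# → {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
```

The controller side (`_low_level_execute_command`) only sees rc/stdout/stderr, which is why the JSON-on-stdout convention is what makes the `ok: ... "ping": "pong"` result possible.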
30564 1726882814.67385: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882814.67396: _low_level_execute_command(): starting 30564 1726882814.67401: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882814.3394887-31168-125307328327971/ > /dev/null 2>&1 && sleep 0' 30564 1726882814.67862: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882814.67871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882814.67906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.67921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882814.67965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882814.67979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882814.68001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882814.68092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882814.69880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882814.69924: stderr chunk (state=3): >>><<< 30564 1726882814.69927: stdout chunk (state=3): >>><<< 30564 1726882814.69944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882814.69949: handler run complete 30564 
1726882814.69964: attempt loop complete, returning result 30564 1726882814.69968: _execute() done 30564 1726882814.69972: dumping result to json 30564 1726882814.69974: done dumping result, returning 30564 1726882814.69983: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-000000000221] 30564 1726882814.69988: sending task result for task 0e448fcc-3ce9-4216-acec-000000000221 30564 1726882814.70078: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000221 30564 1726882814.70081: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882814.70133: no more pending results, returning what we have 30564 1726882814.70136: results queue empty 30564 1726882814.70137: checking for any_errors_fatal 30564 1726882814.70144: done checking for any_errors_fatal 30564 1726882814.70145: checking for max_fail_percentage 30564 1726882814.70147: done checking for max_fail_percentage 30564 1726882814.70147: checking to see if all hosts have failed and the running result is not ok 30564 1726882814.70148: done checking to see if all hosts have failed 30564 1726882814.70149: getting the remaining hosts for this loop 30564 1726882814.70151: done getting the remaining hosts for this loop 30564 1726882814.70154: getting the next task for host managed_node2 30564 1726882814.70168: done getting next task for host managed_node2 30564 1726882814.70170: ^ task is: TASK: meta (role_complete) 30564 1726882814.70174: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882814.70185: getting variables 30564 1726882814.70186: in VariableManager get_vars() 30564 1726882814.70224: Calling all_inventory to load vars for managed_node2 30564 1726882814.70227: Calling groups_inventory to load vars for managed_node2 30564 1726882814.70229: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882814.70239: Calling all_plugins_play to load vars for managed_node2 30564 1726882814.70241: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882814.70244: Calling groups_plugins_play to load vars for managed_node2 30564 1726882814.71090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882814.72109: done with get_vars() 30564 1726882814.72129: done getting variables 30564 1726882814.72188: done queuing things up, now waiting for results queue to drain 30564 1726882814.72190: results queue empty 30564 1726882814.72190: checking for any_errors_fatal 30564 1726882814.72192: done checking for any_errors_fatal 30564 1726882814.72192: checking for max_fail_percentage 30564 1726882814.72193: done checking for max_fail_percentage 30564 1726882814.72194: checking to see if all 
hosts have failed and the running result is not ok 30564 1726882814.72194: done checking to see if all hosts have failed 30564 1726882814.72195: getting the remaining hosts for this loop 30564 1726882814.72195: done getting the remaining hosts for this loop 30564 1726882814.72197: getting the next task for host managed_node2 30564 1726882814.72200: done getting next task for host managed_node2 30564 1726882814.72201: ^ task is: TASK: Show result 30564 1726882814.72203: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882814.72205: getting variables 30564 1726882814.72205: in VariableManager get_vars() 30564 1726882814.72212: Calling all_inventory to load vars for managed_node2 30564 1726882814.72213: Calling groups_inventory to load vars for managed_node2 30564 1726882814.72215: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882814.72219: Calling all_plugins_play to load vars for managed_node2 30564 1726882814.72221: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882814.72223: Calling groups_plugins_play to load vars for managed_node2 30564 1726882814.73336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882814.74999: done with get_vars() 30564 1726882814.75020: done getting variables 30564 1726882814.75068: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:40:14 -0400 (0:00:00.457) 0:00:13.332 ****** 30564 1726882814.75106: entering _queue_task() for managed_node2/debug 30564 1726882814.75418: worker is 1 (out of 1 available) 30564 1726882814.75431: exiting _queue_task() for managed_node2/debug 30564 1726882814.75441: done queuing things up, now waiting for results queue to drain 30564 1726882814.75442: waiting for pending results... 
30564 1726882814.75622: running TaskExecutor() for managed_node2/TASK: Show result 30564 1726882814.75702: in run() - task 0e448fcc-3ce9-4216-acec-00000000018f 30564 1726882814.75713: variable 'ansible_search_path' from source: unknown 30564 1726882814.75717: variable 'ansible_search_path' from source: unknown 30564 1726882814.75744: calling self._execute() 30564 1726882814.75818: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.75821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.75830: variable 'omit' from source: magic vars 30564 1726882814.76097: variable 'ansible_distribution_major_version' from source: facts 30564 1726882814.76116: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882814.76122: variable 'omit' from source: magic vars 30564 1726882814.76154: variable 'omit' from source: magic vars 30564 1726882814.76179: variable 'omit' from source: magic vars 30564 1726882814.76216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882814.76241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882814.76255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882814.76271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.76285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882814.76307: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882814.76311: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882814.76319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882814.76389: Set 
connection var ansible_timeout to 10
30564 1726882814.76393: Set connection var ansible_pipelining to False
30564 1726882814.76396: Set connection var ansible_shell_type to sh
30564 1726882814.76400: Set connection var ansible_shell_executable to /bin/sh
30564 1726882814.76407: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882814.76409: Set connection var ansible_connection to ssh
30564 1726882814.76432: variable 'ansible_shell_executable' from source: unknown
30564 1726882814.76436: variable 'ansible_connection' from source: unknown
30564 1726882814.76439: variable 'ansible_module_compression' from source: unknown
30564 1726882814.76441: variable 'ansible_shell_type' from source: unknown
30564 1726882814.76444: variable 'ansible_shell_executable' from source: unknown
30564 1726882814.76446: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882814.76448: variable 'ansible_pipelining' from source: unknown
30564 1726882814.76450: variable 'ansible_timeout' from source: unknown
30564 1726882814.76452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882814.76550: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882814.76561: variable 'omit' from source: magic vars
30564 1726882814.76564: starting attempt loop
30564 1726882814.76574: running the handler
30564 1726882814.76607: variable '__network_connections_result' from source: set_fact
30564 1726882814.76662: variable '__network_connections_result' from source: set_fact
30564 1726882814.76743: handler run complete
30564 1726882814.76765: attempt loop complete, returning result
30564 1726882814.76771: _execute() done
30564 1726882814.76774: dumping result to json
30564 1726882814.76776: done dumping result, returning
30564 1726882814.76781: done running TaskExecutor() for managed_node2/TASK: Show result [0e448fcc-3ce9-4216-acec-00000000018f]
30564 1726882814.76789: sending task result for task 0e448fcc-3ce9-4216-acec-00000000018f
30564 1726882814.76883: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000018f
30564 1726882814.76886: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef1ddb35-9196-4b00-9c2c-f98653d92d9c"
        ]
    }
}
30564 1726882814.76958: no more pending results, returning what we have
30564 1726882814.76961: results queue empty
30564 1726882814.76971: checking for any_errors_fatal
30564 1726882814.76973: done checking for any_errors_fatal
30564 1726882814.76974: checking for max_fail_percentage
30564 1726882814.76976: done checking for max_fail_percentage
30564 1726882814.76976: checking to see if all hosts have failed and the running result is not ok
30564 1726882814.76977: done checking to see if all hosts have failed
30564 1726882814.76978: getting the remaining hosts for this loop
30564 1726882814.76980: done getting the remaining hosts for this loop
30564 1726882814.76983: getting the next task for host managed_node2
30564 1726882814.76990: done getting next task for host managed_node2
30564 1726882814.76993: ^ task is: TASK: Asserts
30564 1726882814.76996: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882814.77000: getting variables
30564 1726882814.77001: in VariableManager get_vars()
30564 1726882814.77026: Calling all_inventory to load vars for managed_node2
30564 1726882814.77029: Calling groups_inventory to load vars for managed_node2
30564 1726882814.77032: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882814.77041: Calling all_plugins_play to load vars for managed_node2
30564 1726882814.77043: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882814.77046: Calling groups_plugins_play to load vars for managed_node2
30564 1726882814.77940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882814.78857: done with get_vars()
30564 1726882814.78875: done getting variables

TASK [Asserts] *****************************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36
Friday 20 September 2024 21:40:14 -0400 (0:00:00.038) 0:00:13.370 ******
30564 1726882814.78943: entering _queue_task() for managed_node2/include_tasks
30564 1726882814.79143: worker is 1 (out of 1 available)
30564 1726882814.79158: exiting _queue_task() for managed_node2/include_tasks
30564 1726882814.79173: done queuing things up, now waiting for results queue to drain
30564 1726882814.79175: waiting for pending results...
30564 1726882814.79337: running TaskExecutor() for managed_node2/TASK: Asserts
30564 1726882814.79413: in run() - task 0e448fcc-3ce9-4216-acec-000000000096
30564 1726882814.79425: variable 'ansible_search_path' from source: unknown
30564 1726882814.79428: variable 'ansible_search_path' from source: unknown
30564 1726882814.79470: variable 'lsr_assert' from source: include params
30564 1726882814.79614: variable 'lsr_assert' from source: include params
30564 1726882814.79659: variable 'omit' from source: magic vars
30564 1726882814.79750: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882814.79757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882814.79766: variable 'omit' from source: magic vars
30564 1726882814.79920: variable 'ansible_distribution_major_version' from source: facts
30564 1726882814.79929: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882814.79935: variable 'item' from source: unknown
30564 1726882814.79981: variable 'item' from source: unknown
30564 1726882814.80009: variable 'item' from source: unknown
30564 1726882814.80047: variable 'item' from source: unknown
30564 1726882814.80179: dumping result to json
30564 1726882814.80182: done dumping result, returning
30564 1726882814.80184: done running TaskExecutor() for managed_node2/TASK: Asserts [0e448fcc-3ce9-4216-acec-000000000096]
30564 1726882814.80186: sending task result for task 0e448fcc-3ce9-4216-acec-000000000096
30564 1726882814.80225: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000096
30564 1726882814.80228: WORKER PROCESS EXITING
30564 1726882814.80251: no more pending results, returning what we have
30564 1726882814.80254: in VariableManager get_vars()
30564 1726882814.80285: Calling all_inventory to load vars for managed_node2
30564 1726882814.80287: Calling groups_inventory to load vars for managed_node2
30564 1726882814.80290: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882814.80298: Calling all_plugins_play to load vars for managed_node2
30564 1726882814.80300: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882814.80302: Calling groups_plugins_play to load vars for managed_node2
30564 1726882814.81073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882814.82012: done with get_vars()
30564 1726882814.82027: variable 'ansible_search_path' from source: unknown
30564 1726882814.82028: variable 'ansible_search_path' from source: unknown
30564 1726882814.82054: we have included files to process
30564 1726882814.82054: generating all_blocks data
30564 1726882814.82056: done generating all_blocks data
30564 1726882814.82060: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
30564 1726882814.82061: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
30564 1726882814.82062: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml
30564 1726882814.82194: in VariableManager get_vars()
30564 1726882814.82206: done with get_vars()
30564 1726882814.82375: done processing included file
30564 1726882814.82377: iterating over new_blocks loaded from include file
30564 1726882814.82378: in VariableManager get_vars()
30564 1726882814.82386: done with get_vars()
30564 1726882814.82387: filtering new block on tags
30564 1726882814.82420: done filtering new block on tags
30564 1726882814.82421: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=tasks/assert_profile_present.yml)
30564 1726882814.82425: extending task lists for all hosts with included blocks
30564 1726882814.83049: done extending task lists
30564 1726882814.83050: done processing included files
30564 1726882814.83050: results queue empty
30564 1726882814.83051: checking for any_errors_fatal
30564 1726882814.83055: done checking for any_errors_fatal
30564 1726882814.83055: checking for max_fail_percentage
30564 1726882814.83056: done checking for max_fail_percentage
30564 1726882814.83056: checking to see if all hosts have failed and the running result is not ok
30564 1726882814.83057: done checking to see if all hosts have failed
30564 1726882814.83057: getting the remaining hosts for this loop
30564 1726882814.83058: done getting the remaining hosts for this loop
30564 1726882814.83060: getting the next task for host managed_node2
30564 1726882814.83063: done getting next task for host managed_node2
30564 1726882814.83065: ^ task is: TASK: Include the task 'get_profile_stat.yml'
30564 1726882814.83069: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882814.83071: getting variables
30564 1726882814.83072: in VariableManager get_vars()
30564 1726882814.83078: Calling all_inventory to load vars for managed_node2
30564 1726882814.83079: Calling groups_inventory to load vars for managed_node2
30564 1726882814.83081: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882814.83085: Calling all_plugins_play to load vars for managed_node2
30564 1726882814.83086: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882814.83088: Calling groups_plugins_play to load vars for managed_node2
30564 1726882814.86533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882814.87441: done with get_vars()
30564 1726882814.87455: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024 21:40:14 -0400 (0:00:00.085) 0:00:13.456 ******
30564 1726882814.87509: entering _queue_task() for managed_node2/include_tasks
30564 1726882814.87818: worker is 1 (out of 1 available)
30564 1726882814.87830: exiting _queue_task() for managed_node2/include_tasks
30564 1726882814.87840: done queuing things up, now waiting for results queue to drain
30564 1726882814.87842: waiting for pending results...
30564 1726882814.88018: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml'
30564 1726882814.88101: in run() - task 0e448fcc-3ce9-4216-acec-000000000383
30564 1726882814.88111: variable 'ansible_search_path' from source: unknown
30564 1726882814.88115: variable 'ansible_search_path' from source: unknown
30564 1726882814.88141: calling self._execute()
30564 1726882814.88216: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882814.88220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882814.88231: variable 'omit' from source: magic vars
30564 1726882814.88503: variable 'ansible_distribution_major_version' from source: facts
30564 1726882814.88514: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882814.88521: _execute() done
30564 1726882814.88524: dumping result to json
30564 1726882814.88527: done dumping result, returning
30564 1726882814.88531: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4216-acec-000000000383]
30564 1726882814.88537: sending task result for task 0e448fcc-3ce9-4216-acec-000000000383
30564 1726882814.88626: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000383
30564 1726882814.88629: WORKER PROCESS EXITING
30564 1726882814.88654: no more pending results, returning what we have
30564 1726882814.88658: in VariableManager get_vars()
30564 1726882814.88695: Calling all_inventory to load vars for managed_node2
30564 1726882814.88698: Calling groups_inventory to load vars for managed_node2
30564 1726882814.88701: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882814.88714: Calling all_plugins_play to load vars for managed_node2
30564 1726882814.88717: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882814.88720: Calling groups_plugins_play to load vars for managed_node2
30564 1726882814.90173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882814.91586: done with get_vars()
30564 1726882814.91597: variable 'ansible_search_path' from source: unknown
30564 1726882814.91598: variable 'ansible_search_path' from source: unknown
30564 1726882814.91604: variable 'item' from source: include params
30564 1726882814.91684: variable 'item' from source: include params
30564 1726882814.91706: we have included files to process
30564 1726882814.91707: generating all_blocks data
30564 1726882814.91708: done generating all_blocks data
30564 1726882814.91709: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
30564 1726882814.91710: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
30564 1726882814.91711: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
30564 1726882814.92471: done processing included file
30564 1726882814.92473: iterating over new_blocks loaded from include file
30564 1726882814.92474: in VariableManager get_vars()
30564 1726882814.92484: done with get_vars()
30564 1726882814.92485: filtering new block on tags
30564 1726882814.92553: done filtering new block on tags
30564 1726882814.92556: in VariableManager get_vars()
30564 1726882814.92566: done with get_vars()
30564 1726882814.92569: filtering new block on tags
30564 1726882814.92602: done filtering new block on tags
30564 1726882814.92603: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2
30564 1726882814.92607: extending task lists for all hosts with included blocks
30564 1726882814.92765: done extending task lists
30564 1726882814.92766: done processing included files
30564 1726882814.92769: results queue empty
30564 1726882814.92769: checking for any_errors_fatal
30564 1726882814.92772: done checking for any_errors_fatal
30564 1726882814.92772: checking for max_fail_percentage
30564 1726882814.92773: done checking for max_fail_percentage
30564 1726882814.92774: checking to see if all hosts have failed and the running result is not ok
30564 1726882814.92774: done checking to see if all hosts have failed
30564 1726882814.92775: getting the remaining hosts for this loop
30564 1726882814.92775: done getting the remaining hosts for this loop
30564 1726882814.92777: getting the next task for host managed_node2
30564 1726882814.92780: done getting next task for host managed_node2
30564 1726882814.92781: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
30564 1726882814.92783: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882814.92785: getting variables
30564 1726882814.92785: in VariableManager get_vars()
30564 1726882814.92791: Calling all_inventory to load vars for managed_node2
30564 1726882814.92792: Calling groups_inventory to load vars for managed_node2
30564 1726882814.92794: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882814.92797: Calling all_plugins_play to load vars for managed_node2
30564 1726882814.92798: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882814.92800: Calling groups_plugins_play to load vars for managed_node2
30564 1726882814.93993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882814.95748: done with get_vars()
30564 1726882814.95775: done getting variables
30564 1726882814.95814: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 21:40:14 -0400 (0:00:00.083) 0:00:13.539 ******
30564 1726882814.95843: entering _queue_task() for managed_node2/set_fact
30564 1726882814.96171: worker is 1 (out of 1 available)
30564 1726882814.96182: exiting _queue_task() for managed_node2/set_fact
30564 1726882814.96194: done queuing things up, now waiting for results queue to drain
30564 1726882814.96196: waiting for pending results...
30564 1726882814.96892: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag
30564 1726882814.97023: in run() - task 0e448fcc-3ce9-4216-acec-0000000003fe
30564 1726882814.97053: variable 'ansible_search_path' from source: unknown
30564 1726882814.97062: variable 'ansible_search_path' from source: unknown
30564 1726882814.97102: calling self._execute()
30564 1726882814.97198: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882814.97210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882814.97223: variable 'omit' from source: magic vars
30564 1726882814.97583: variable 'ansible_distribution_major_version' from source: facts
30564 1726882814.97603: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882814.97613: variable 'omit' from source: magic vars
30564 1726882814.97670: variable 'omit' from source: magic vars
30564 1726882814.97710: variable 'omit' from source: magic vars
30564 1726882814.97753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882814.97796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882814.97822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882814.97845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882814.97862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882814.97897: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882814.97907: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882814.97917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882814.98021: Set connection var ansible_timeout to 10
30564 1726882814.98037: Set connection var ansible_pipelining to False
30564 1726882814.98044: Set connection var ansible_shell_type to sh
30564 1726882814.98054: Set connection var ansible_shell_executable to /bin/sh
30564 1726882814.98068: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882814.98075: Set connection var ansible_connection to ssh
30564 1726882814.98102: variable 'ansible_shell_executable' from source: unknown
30564 1726882814.98109: variable 'ansible_connection' from source: unknown
30564 1726882814.98115: variable 'ansible_module_compression' from source: unknown
30564 1726882814.98121: variable 'ansible_shell_type' from source: unknown
30564 1726882814.98127: variable 'ansible_shell_executable' from source: unknown
30564 1726882814.98137: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882814.98144: variable 'ansible_pipelining' from source: unknown
30564 1726882814.98147: variable 'ansible_timeout' from source: unknown
30564 1726882814.98150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882814.98286: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882814.98295: variable 'omit' from source: magic vars
30564 1726882814.98300: starting attempt loop
30564 1726882814.98303: running the handler
30564 1726882814.98316: handler run complete
30564 1726882814.98325: attempt loop complete, returning result
30564 1726882814.98329: _execute() done
30564 1726882814.98331: dumping result to json
30564 1726882814.98333: done dumping result, returning
30564 1726882814.98340: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4216-acec-0000000003fe]
30564 1726882814.98346: sending task result for task 0e448fcc-3ce9-4216-acec-0000000003fe
30564 1726882814.98475: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000003fe
30564 1726882814.98477: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
30564 1726882814.98532: no more pending results, returning what we have
30564 1726882814.98535: results queue empty
30564 1726882814.98536: checking for any_errors_fatal
30564 1726882814.98539: done checking for any_errors_fatal
30564 1726882814.98540: checking for max_fail_percentage
30564 1726882814.98542: done checking for max_fail_percentage
30564 1726882814.98542: checking to see if all hosts have failed and the running result is not ok
30564 1726882814.98543: done checking to see if all hosts have failed
30564 1726882814.98544: getting the remaining hosts for this loop
30564 1726882814.98546: done getting the remaining hosts for this loop
30564 1726882814.98550: getting the next task for host managed_node2
30564 1726882814.98560: done getting next task for host managed_node2
30564 1726882814.98562: ^ task is: TASK: Stat profile file
30564 1726882814.98571: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882814.98576: getting variables
30564 1726882814.98578: in VariableManager get_vars()
30564 1726882814.98607: Calling all_inventory to load vars for managed_node2
30564 1726882814.98610: Calling groups_inventory to load vars for managed_node2
30564 1726882814.98613: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882814.98624: Calling all_plugins_play to load vars for managed_node2
30564 1726882814.98627: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882814.98630: Calling groups_plugins_play to load vars for managed_node2
30564 1726882815.00834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882815.04293: done with get_vars()
30564 1726882815.04316: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 21:40:15 -0400 (0:00:00.085) 0:00:13.625 ******
30564 1726882815.04413: entering _queue_task() for managed_node2/stat
30564 1726882815.04716: worker is 1 (out of 1 available)
30564 1726882815.04730: exiting _queue_task() for managed_node2/stat
30564 1726882815.04743: done queuing things up, now waiting for results queue to drain
30564 1726882815.04744: waiting for pending results...
30564 1726882815.05825: running TaskExecutor() for managed_node2/TASK: Stat profile file
30564 1726882815.05982: in run() - task 0e448fcc-3ce9-4216-acec-0000000003ff
30564 1726882815.06185: variable 'ansible_search_path' from source: unknown
30564 1726882815.06193: variable 'ansible_search_path' from source: unknown
30564 1726882815.06233: calling self._execute()
30564 1726882815.06331: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882815.06578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882815.06594: variable 'omit' from source: magic vars
30564 1726882815.07020: variable 'ansible_distribution_major_version' from source: facts
30564 1726882815.07250: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882815.07262: variable 'omit' from source: magic vars
30564 1726882815.07323: variable 'omit' from source: magic vars
30564 1726882815.07551: variable 'profile' from source: play vars
30564 1726882815.07560: variable 'interface' from source: play vars
30564 1726882815.07647: variable 'interface' from source: play vars
30564 1726882815.07812: variable 'omit' from source: magic vars
30564 1726882815.07856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882815.07900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882815.08051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882815.08075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882815.08091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882815.08124: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882815.08133: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882815.08140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882815.08390: Set connection var ansible_timeout to 10
30564 1726882815.08400: Set connection var ansible_pipelining to False
30564 1726882815.08406: Set connection var ansible_shell_type to sh
30564 1726882815.08415: Set connection var ansible_shell_executable to /bin/sh
30564 1726882815.08426: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882815.08432: Set connection var ansible_connection to ssh
30564 1726882815.08458: variable 'ansible_shell_executable' from source: unknown
30564 1726882815.08732: variable 'ansible_connection' from source: unknown
30564 1726882815.08739: variable 'ansible_module_compression' from source: unknown
30564 1726882815.08745: variable 'ansible_shell_type' from source: unknown
30564 1726882815.08751: variable 'ansible_shell_executable' from source: unknown
30564 1726882815.08757: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882815.08766: variable 'ansible_pipelining' from source: unknown
30564 1726882815.08774: variable 'ansible_timeout' from source: unknown
30564 1726882815.08781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882815.08971: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
30564 1726882815.08987: variable 'omit' from source: magic vars
30564 1726882815.08996: starting attempt loop
30564 1726882815.09002: running the handler
30564 1726882815.09017: _low_level_execute_command(): starting
30564 1726882815.09178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30564 1726882815.10993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30564 1726882815.10999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882815.11036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<<
30564 1726882815.11040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882815.11212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882815.11215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882815.11341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882815.13015: stdout chunk (state=3): >>>/root <<<
30564 1726882815.13261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882815.13267: stdout chunk (state=3): >>><<<
30564 1726882815.13269: stderr chunk (state=3): >>><<<
30564 1726882815.13418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30564 1726882815.13422: _low_level_execute_command(): starting
30564 1726882815.13424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842 `" && echo ansible-tmp-1726882815.1332293-31194-193853977348842="` echo /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842 `" ) && sleep 0'
30564 1726882815.14145: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30564 1726882815.14160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882815.14181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882815.14207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882815.14250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882815.14275: stderr chunk (state=3): >>>debug2: match not found <<<
30564 1726882815.14289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882815.14314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30564 1726882815.14327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
30564 1726882815.14339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30564 1726882815.14351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882815.14367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882815.14383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882815.14394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882815.14412: stderr chunk (state=3): >>>debug2: match found <<<
30564 1726882815.14425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882815.14502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882815.14529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882815.14546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882815.14681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882815.16594: stdout chunk (state=3): >>>ansible-tmp-1726882815.1332293-31194-193853977348842=/root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842 <<<
30564 1726882815.16770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882815.16774: stderr chunk (state=3): >>><<<
30564 1726882815.16777: stdout chunk (state=3): >>><<<
30564 1726882815.16796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882815.1332293-31194-193853977348842=/root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882815.16843: variable 'ansible_module_compression' from source: unknown 30564 1726882815.16908: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882815.16948: variable 'ansible_facts' from source: unknown 30564 1726882815.17034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842/AnsiballZ_stat.py 30564 1726882815.17199: Sending initial data 30564 1726882815.17202: Sent initial data (153 bytes) 30564 1726882815.18145: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 
1726882815.18154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.18165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.18184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.18221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882815.18229: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882815.18235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.18248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882815.18256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882815.18262: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882815.18275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.18286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.18299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.18308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882815.18316: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882815.18326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.18400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882815.18884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882815.18894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882815.19024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.20812: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882815.20906: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882815.21006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpvhp_8tzi /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842/AnsiballZ_stat.py <<< 30564 1726882815.21101: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882815.23701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.23896: stderr chunk (state=3): >>><<< 30564 1726882815.23899: stdout chunk (state=3): >>><<< 30564 1726882815.23902: done transferring module to remote 30564 1726882815.23904: _low_level_execute_command(): starting 30564 1726882815.23906: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842/ /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842/AnsiballZ_stat.py && sleep 0' 30564 1726882815.24924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882815.24931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.24986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882815.24989: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882815.25024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.25332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.27179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.27183: stdout chunk (state=3): >>><<< 30564 1726882815.27190: stderr chunk (state=3): >>><<< 30564 1726882815.27215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882815.27218: _low_level_execute_command(): starting 30564 1726882815.27223: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842/AnsiballZ_stat.py && sleep 0' 30564 1726882815.27863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882815.27875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.27887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.27899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.27936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882815.27942: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882815.27951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.27976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882815.27983: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882815.27990: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882815.27997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.28006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.28016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.28023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882815.28029: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882815.28038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.28118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882815.28132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882815.28143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.28421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.41633: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882815.42682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882815.42729: stderr chunk (state=3): >>><<< 30564 1726882815.42733: stdout chunk (state=3): >>><<< 30564 1726882815.42748: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
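The module exchange recorded above ends with AnsiballZ_stat.py printing a single JSON document on stdout, which the controller parses into the registered `profile_stat` variable and later uses to evaluate the `when: profile_stat.stat.exists` conditional (shown below evaluating to False). A minimal sketch of that parse step — the payload string is copied from the log; the variable names are illustrative, not Ansible internals:

```python
import json

# JSON emitted on stdout by AnsiballZ_stat.py, as captured in the log above.
module_stdout = (
    '{"changed": false, "stat": {"exists": false}, '
    '"invocation": {"module_args": {"get_attributes": false, '
    '"get_checksum": false, "get_mime": false, '
    '"path": "/etc/sysconfig/network-scripts/ifcfg-statebr", '
    '"follow": false, "checksum_algorithm": "sha1"}}}'
)

result = json.loads(module_stdout)

# The playbook registers this dict as `profile_stat`; the later task's
# `when: profile_stat.stat.exists` reduces to this lookup:
exists = result["stat"]["exists"]
print(exists)  # False -> "Set NM profile exist flag" task is skipped
```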
30564 1726882815.42778: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882815.42785: _low_level_execute_command(): starting 30564 1726882815.42790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882815.1332293-31194-193853977348842/ > /dev/null 2>&1 && sleep 0' 30564 1726882815.43223: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.43229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.43271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.43274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882815.43286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.43332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882815.43339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.43453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.45277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.45314: stderr chunk (state=3): >>><<< 30564 1726882815.45317: stdout chunk (state=3): >>><<< 30564 1726882815.45329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882815.45335: handler run complete 30564 1726882815.45352: attempt loop complete, returning result 30564 1726882815.45356: _execute() done 30564 1726882815.45358: dumping result to json 30564 1726882815.45360: done dumping result, returning 30564 1726882815.45374: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4216-acec-0000000003ff] 30564 1726882815.45380: sending task result for task 0e448fcc-3ce9-4216-acec-0000000003ff 30564 1726882815.45474: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000003ff 30564 1726882815.45478: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882815.45531: no more pending results, returning what we have 30564 1726882815.45535: results queue empty 30564 1726882815.45536: checking for any_errors_fatal 30564 1726882815.45543: done checking for any_errors_fatal 30564 1726882815.45544: checking for max_fail_percentage 30564 1726882815.45545: done checking for max_fail_percentage 30564 1726882815.45546: checking to see if all hosts have failed and the running result is not ok 30564 1726882815.45547: done checking to see if all hosts have failed 30564 1726882815.45548: getting the remaining hosts for this loop 30564 1726882815.45549: done getting the remaining hosts for this loop 30564 1726882815.45553: getting the next task for host managed_node2 30564 1726882815.45561: done getting next task for host managed_node2 30564 1726882815.45566: ^ task is: TASK: Set NM profile exist flag based on the profile files 30564 1726882815.45573: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882815.45578: getting variables 30564 1726882815.45579: in VariableManager get_vars() 30564 1726882815.45611: Calling all_inventory to load vars for managed_node2 30564 1726882815.45614: Calling groups_inventory to load vars for managed_node2 30564 1726882815.45617: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882815.45628: Calling all_plugins_play to load vars for managed_node2 30564 1726882815.45631: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882815.45633: Calling groups_plugins_play to load vars for managed_node2 30564 1726882815.46792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882815.48385: done with get_vars() 30564 1726882815.48402: done getting variables 30564 1726882815.48446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:40:15 -0400 (0:00:00.440) 0:00:14.066 ****** 30564 1726882815.48483: entering _queue_task() for managed_node2/set_fact 30564 1726882815.48712: worker is 1 (out of 1 available) 30564 1726882815.48726: exiting _queue_task() for managed_node2/set_fact 30564 1726882815.48738: done queuing things up, now waiting for results queue to drain 30564 1726882815.48739: waiting for pending results... 30564 1726882815.48910: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30564 1726882815.48990: in run() - task 0e448fcc-3ce9-4216-acec-000000000400 30564 1726882815.49001: variable 'ansible_search_path' from source: unknown 30564 1726882815.49004: variable 'ansible_search_path' from source: unknown 30564 1726882815.49032: calling self._execute() 30564 1726882815.49107: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.49112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.49121: variable 'omit' from source: magic vars 30564 1726882815.49387: variable 'ansible_distribution_major_version' from source: facts 30564 1726882815.49399: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882815.49489: variable 'profile_stat' from source: set_fact 30564 1726882815.49498: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882815.49501: when evaluation is False, skipping this task 30564 1726882815.49508: _execute() done 30564 1726882815.49512: dumping result to json 30564 1726882815.49520: done dumping 
result, returning 30564 1726882815.49526: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4216-acec-000000000400] 30564 1726882815.49531: sending task result for task 0e448fcc-3ce9-4216-acec-000000000400 30564 1726882815.49615: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000400 30564 1726882815.49620: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882815.49669: no more pending results, returning what we have 30564 1726882815.49674: results queue empty 30564 1726882815.49675: checking for any_errors_fatal 30564 1726882815.49682: done checking for any_errors_fatal 30564 1726882815.49683: checking for max_fail_percentage 30564 1726882815.49685: done checking for max_fail_percentage 30564 1726882815.49686: checking to see if all hosts have failed and the running result is not ok 30564 1726882815.49687: done checking to see if all hosts have failed 30564 1726882815.49688: getting the remaining hosts for this loop 30564 1726882815.49689: done getting the remaining hosts for this loop 30564 1726882815.49694: getting the next task for host managed_node2 30564 1726882815.49702: done getting next task for host managed_node2 30564 1726882815.49704: ^ task is: TASK: Get NM profile info 30564 1726882815.49708: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882815.49714: getting variables 30564 1726882815.49715: in VariableManager get_vars() 30564 1726882815.49746: Calling all_inventory to load vars for managed_node2 30564 1726882815.49749: Calling groups_inventory to load vars for managed_node2 30564 1726882815.49752: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882815.49762: Calling all_plugins_play to load vars for managed_node2 30564 1726882815.49766: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882815.49769: Calling groups_plugins_play to load vars for managed_node2 30564 1726882815.50751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882815.52310: done with get_vars() 30564 1726882815.52330: done getting variables 30564 1726882815.52425: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:40:15 -0400 (0:00:00.039) 
0:00:14.105 ****** 30564 1726882815.52454: entering _queue_task() for managed_node2/shell 30564 1726882815.52455: Creating lock for shell 30564 1726882815.52736: worker is 1 (out of 1 available) 30564 1726882815.52749: exiting _queue_task() for managed_node2/shell 30564 1726882815.52761: done queuing things up, now waiting for results queue to drain 30564 1726882815.52762: waiting for pending results... 30564 1726882815.53052: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30564 1726882815.53145: in run() - task 0e448fcc-3ce9-4216-acec-000000000401 30564 1726882815.53154: variable 'ansible_search_path' from source: unknown 30564 1726882815.53158: variable 'ansible_search_path' from source: unknown 30564 1726882815.53190: calling self._execute() 30564 1726882815.53265: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.53274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.53284: variable 'omit' from source: magic vars 30564 1726882815.53542: variable 'ansible_distribution_major_version' from source: facts 30564 1726882815.53553: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882815.53559: variable 'omit' from source: magic vars 30564 1726882815.53599: variable 'omit' from source: magic vars 30564 1726882815.53666: variable 'profile' from source: play vars 30564 1726882815.53674: variable 'interface' from source: play vars 30564 1726882815.53723: variable 'interface' from source: play vars 30564 1726882815.53741: variable 'omit' from source: magic vars 30564 1726882815.53777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882815.53806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882815.53820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882815.53835: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882815.53845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882815.53872: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882815.53876: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.53879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.53946: Set connection var ansible_timeout to 10 30564 1726882815.53949: Set connection var ansible_pipelining to False 30564 1726882815.53952: Set connection var ansible_shell_type to sh 30564 1726882815.53959: Set connection var ansible_shell_executable to /bin/sh 30564 1726882815.53965: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882815.53972: Set connection var ansible_connection to ssh 30564 1726882815.53988: variable 'ansible_shell_executable' from source: unknown 30564 1726882815.53991: variable 'ansible_connection' from source: unknown 30564 1726882815.53993: variable 'ansible_module_compression' from source: unknown 30564 1726882815.53995: variable 'ansible_shell_type' from source: unknown 30564 1726882815.53998: variable 'ansible_shell_executable' from source: unknown 30564 1726882815.54000: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.54004: variable 'ansible_pipelining' from source: unknown 30564 1726882815.54007: variable 'ansible_timeout' from source: unknown 30564 1726882815.54010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.54107: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882815.54115: variable 'omit' from source: magic vars 30564 1726882815.54121: starting attempt loop 30564 1726882815.54124: running the handler 30564 1726882815.54133: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882815.54147: _low_level_execute_command(): starting 30564 1726882815.54155: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882815.54646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.54662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.54681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.54694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.54741: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882815.54753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.54873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.56515: stdout chunk (state=3): >>>/root <<< 30564 1726882815.56617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.56659: stderr chunk (state=3): >>><<< 30564 1726882815.56662: stdout chunk (state=3): >>><<< 30564 1726882815.56686: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882815.56698: _low_level_execute_command(): starting 30564 1726882815.56703: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401 `" && echo ansible-tmp-1726882815.5668511-31219-245802308145401="` echo /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401 `" ) && sleep 0' 30564 1726882815.57118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.57129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.57153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.57157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.57159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.57221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882815.57225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.57331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.59192: stdout chunk (state=3): >>>ansible-tmp-1726882815.5668511-31219-245802308145401=/root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401 <<< 30564 1726882815.59306: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.59345: stderr chunk (state=3): >>><<< 30564 1726882815.59348: stdout chunk (state=3): >>><<< 30564 1726882815.59361: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882815.5668511-31219-245802308145401=/root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882815.59387: variable 'ansible_module_compression' from source: unknown 30564 1726882815.59425: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882815.59459: variable 'ansible_facts' from source: unknown 30564 1726882815.59516: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401/AnsiballZ_command.py 30564 1726882815.59611: 
Sending initial data 30564 1726882815.59616: Sent initial data (156 bytes) 30564 1726882815.60235: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.60241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.60290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.60294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882815.60296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.60344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882815.60352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.60465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.62200: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882815.62206: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882815.62296: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882815.62395: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpvs81ba6x /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401/AnsiballZ_command.py <<< 30564 1726882815.62490: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882815.63504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.63597: stderr chunk (state=3): >>><<< 30564 1726882815.63600: stdout chunk (state=3): >>><<< 30564 1726882815.63616: done transferring module to remote 30564 1726882815.63625: _low_level_execute_command(): starting 30564 1726882815.63629: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401/ /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401/AnsiballZ_command.py && sleep 0' 30564 1726882815.64050: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.64062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.64095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.64107: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.64153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882815.64169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.64282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.66029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.66071: stderr chunk (state=3): >>><<< 30564 1726882815.66080: stdout chunk (state=3): >>><<< 30564 1726882815.66095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882815.66098: _low_level_execute_command(): starting 30564 1726882815.66104: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401/AnsiballZ_command.py && sleep 0' 30564 1726882815.66523: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.66534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.66566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882815.66572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882815.66582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.66592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.66643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882815.66659: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.66768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.81985: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:40:15.797993", "end": "2024-09-20 21:40:15.817767", "delta": "0:00:00.019774", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882815.83295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882815.83299: stdout chunk (state=3): >>><<< 30564 1726882815.83301: stderr chunk (state=3): >>><<< 30564 1726882815.83444: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:40:15.797993", "end": "2024-09-20 21:40:15.817767", "delta": "0:00:00.019774", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882815.83448: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882815.83451: _low_level_execute_command(): starting 30564 1726882815.83453: _low_level_execute_command(): executing: 
/bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882815.5668511-31219-245802308145401/ > /dev/null 2>&1 && sleep 0' 30564 1726882815.84045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882815.84058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.84074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.84093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.84140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882815.84156: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882815.84172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.84189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882815.84200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882815.84216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882815.84229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882815.84241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882815.84255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882815.84268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882815.84280: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882815.84292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882815.84376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 30564 1726882815.84396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882815.84410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882815.84544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882815.86408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882815.86411: stdout chunk (state=3): >>><<< 30564 1726882815.86413: stderr chunk (state=3): >>><<< 30564 1726882815.86835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882815.86839: handler run complete 30564 1726882815.86841: Evaluated conditional (False): False 30564 1726882815.86843: attempt loop complete, returning result 30564 1726882815.86845: _execute() done 30564 1726882815.86847: dumping result to json 
30564 1726882815.86849: done dumping result, returning 30564 1726882815.86851: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4216-acec-000000000401] 30564 1726882815.86853: sending task result for task 0e448fcc-3ce9-4216-acec-000000000401 30564 1726882815.86926: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000401 30564 1726882815.86930: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.019774", "end": "2024-09-20 21:40:15.817767", "rc": 0, "start": "2024-09-20 21:40:15.797993" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30564 1726882815.86999: no more pending results, returning what we have 30564 1726882815.87002: results queue empty 30564 1726882815.87003: checking for any_errors_fatal 30564 1726882815.87008: done checking for any_errors_fatal 30564 1726882815.87009: checking for max_fail_percentage 30564 1726882815.87011: done checking for max_fail_percentage 30564 1726882815.87012: checking to see if all hosts have failed and the running result is not ok 30564 1726882815.87013: done checking to see if all hosts have failed 30564 1726882815.87014: getting the remaining hosts for this loop 30564 1726882815.87015: done getting the remaining hosts for this loop 30564 1726882815.87018: getting the next task for host managed_node2 30564 1726882815.87025: done getting next task for host managed_node2 30564 1726882815.87027: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882815.87032: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882815.87035: getting variables 30564 1726882815.87036: in VariableManager get_vars() 30564 1726882815.87067: Calling all_inventory to load vars for managed_node2 30564 1726882815.87070: Calling groups_inventory to load vars for managed_node2 30564 1726882815.87073: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882815.87084: Calling all_plugins_play to load vars for managed_node2 30564 1726882815.87087: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882815.87090: Calling groups_plugins_play to load vars for managed_node2 30564 1726882815.88590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882815.90258: done with get_vars() 30564 1726882815.90286: done getting variables 30564 1726882815.90347: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:40:15 -0400 (0:00:00.379) 0:00:14.485 ****** 30564 1726882815.90382: entering _queue_task() for managed_node2/set_fact 30564 1726882815.90716: worker is 1 (out of 1 available) 30564 1726882815.90730: exiting _queue_task() for managed_node2/set_fact 30564 1726882815.90744: done queuing things up, now waiting for results queue to drain 30564 1726882815.90746: waiting for pending results... 30564 1726882815.91039: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882815.91168: in run() - task 0e448fcc-3ce9-4216-acec-000000000402 30564 1726882815.91195: variable 'ansible_search_path' from source: unknown 30564 1726882815.91203: variable 'ansible_search_path' from source: unknown 30564 1726882815.91244: calling self._execute() 30564 1726882815.91344: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.91356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.91373: variable 'omit' from source: magic vars 30564 1726882815.91744: variable 'ansible_distribution_major_version' from source: facts 30564 1726882815.91765: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882815.91899: variable 'nm_profile_exists' from source: set_fact 30564 1726882815.91917: Evaluated conditional (nm_profile_exists.rc == 0): True 30564 1726882815.91928: variable 'omit' from source: magic vars 30564 1726882815.91986: variable 'omit' from source: magic vars 30564 1726882815.92023: 
variable 'omit' from source: magic vars 30564 1726882815.92074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882815.92114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882815.92136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882815.92163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882815.92184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882815.92218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882815.92227: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.92235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.92339: Set connection var ansible_timeout to 10 30564 1726882815.92349: Set connection var ansible_pipelining to False 30564 1726882815.92356: Set connection var ansible_shell_type to sh 30564 1726882815.92369: Set connection var ansible_shell_executable to /bin/sh 30564 1726882815.92386: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882815.92393: Set connection var ansible_connection to ssh 30564 1726882815.92422: variable 'ansible_shell_executable' from source: unknown 30564 1726882815.92431: variable 'ansible_connection' from source: unknown 30564 1726882815.92437: variable 'ansible_module_compression' from source: unknown 30564 1726882815.92444: variable 'ansible_shell_type' from source: unknown 30564 1726882815.92451: variable 'ansible_shell_executable' from source: unknown 30564 1726882815.92457: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.92466: variable 'ansible_pipelining' from 
source: unknown 30564 1726882815.92474: variable 'ansible_timeout' from source: unknown 30564 1726882815.92484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.92630: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882815.92647: variable 'omit' from source: magic vars 30564 1726882815.92657: starting attempt loop 30564 1726882815.92667: running the handler 30564 1726882815.92685: handler run complete 30564 1726882815.92699: attempt loop complete, returning result 30564 1726882815.92709: _execute() done 30564 1726882815.92716: dumping result to json 30564 1726882815.92723: done dumping result, returning 30564 1726882815.92735: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4216-acec-000000000402] 30564 1726882815.92745: sending task result for task 0e448fcc-3ce9-4216-acec-000000000402 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30564 1726882815.92900: no more pending results, returning what we have 30564 1726882815.92904: results queue empty 30564 1726882815.92905: checking for any_errors_fatal 30564 1726882815.92916: done checking for any_errors_fatal 30564 1726882815.92917: checking for max_fail_percentage 30564 1726882815.92919: done checking for max_fail_percentage 30564 1726882815.92920: checking to see if all hosts have failed and the running result is not ok 30564 1726882815.92920: done checking to see if all hosts have failed 30564 1726882815.92922: getting the remaining hosts for this loop 30564 1726882815.92923: done 
getting the remaining hosts for this loop 30564 1726882815.92927: getting the next task for host managed_node2 30564 1726882815.92940: done getting next task for host managed_node2 30564 1726882815.92943: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882815.92948: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882815.92953: getting variables 30564 1726882815.92954: in VariableManager get_vars() 30564 1726882815.92988: Calling all_inventory to load vars for managed_node2 30564 1726882815.92991: Calling groups_inventory to load vars for managed_node2 30564 1726882815.92994: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882815.93006: Calling all_plugins_play to load vars for managed_node2 30564 1726882815.93010: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882815.93013: Calling groups_plugins_play to load vars for managed_node2 30564 1726882815.94170: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000402 30564 1726882815.94174: WORKER PROCESS EXITING 30564 1726882815.94819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882815.96551: done with get_vars() 30564 1726882815.96573: done getting variables 30564 1726882815.96628: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882815.96749: variable 'profile' from source: play vars 30564 1726882815.96753: variable 'interface' from source: play vars 30564 1726882815.96816: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:40:15 -0400 (0:00:00.064) 0:00:14.549 ****** 30564 1726882815.96848: entering _queue_task() for managed_node2/command 30564 1726882815.97133: worker is 1 (out of 1 available) 30564 1726882815.97146: exiting _queue_task() for managed_node2/command 30564 
1726882815.97158: done queuing things up, now waiting for results queue to drain 30564 1726882815.97159: waiting for pending results... 30564 1726882815.97431: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30564 1726882815.97550: in run() - task 0e448fcc-3ce9-4216-acec-000000000404 30564 1726882815.97574: variable 'ansible_search_path' from source: unknown 30564 1726882815.97583: variable 'ansible_search_path' from source: unknown 30564 1726882815.97625: calling self._execute() 30564 1726882815.97716: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882815.97725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882815.97738: variable 'omit' from source: magic vars 30564 1726882815.98107: variable 'ansible_distribution_major_version' from source: facts 30564 1726882815.98126: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882815.98261: variable 'profile_stat' from source: set_fact 30564 1726882815.98281: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882815.98289: when evaluation is False, skipping this task 30564 1726882815.98296: _execute() done 30564 1726882815.98303: dumping result to json 30564 1726882815.98310: done dumping result, returning 30564 1726882815.98320: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000404] 30564 1726882815.98331: sending task result for task 0e448fcc-3ce9-4216-acec-000000000404 30564 1726882815.98445: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000404 30564 1726882815.98453: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882815.98516: no more pending results, returning what we have 30564 1726882815.98520: results queue empty 30564 
1726882815.98521: checking for any_errors_fatal 30564 1726882815.98527: done checking for any_errors_fatal 30564 1726882815.98528: checking for max_fail_percentage 30564 1726882815.98530: done checking for max_fail_percentage 30564 1726882815.98531: checking to see if all hosts have failed and the running result is not ok 30564 1726882815.98532: done checking to see if all hosts have failed 30564 1726882815.98533: getting the remaining hosts for this loop 30564 1726882815.98535: done getting the remaining hosts for this loop 30564 1726882815.98539: getting the next task for host managed_node2 30564 1726882815.98549: done getting next task for host managed_node2 30564 1726882815.98552: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882815.98558: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882815.98563: getting variables 30564 1726882815.98566: in VariableManager get_vars() 30564 1726882815.98598: Calling all_inventory to load vars for managed_node2 30564 1726882815.98602: Calling groups_inventory to load vars for managed_node2 30564 1726882815.98606: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882815.98620: Calling all_plugins_play to load vars for managed_node2 30564 1726882815.98624: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882815.98627: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.00281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.01957: done with get_vars() 30564 1726882816.01981: done getting variables 30564 1726882816.02038: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882816.02153: variable 'profile' from source: play vars 30564 1726882816.02157: variable 'interface' from source: play vars 30564 1726882816.02218: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:40:16 -0400 (0:00:00.053) 0:00:14.603 ****** 30564 1726882816.02249: entering _queue_task() for managed_node2/set_fact 30564 1726882816.02532: worker is 1 (out of 1 available) 30564 1726882816.02543: exiting _queue_task() for managed_node2/set_fact 30564 1726882816.02554: done queuing things up, now waiting for results queue to drain 30564 1726882816.02555: waiting for pending results... 
30564 1726882816.02822: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 30564 1726882816.02952: in run() - task 0e448fcc-3ce9-4216-acec-000000000405 30564 1726882816.02974: variable 'ansible_search_path' from source: unknown 30564 1726882816.02982: variable 'ansible_search_path' from source: unknown 30564 1726882816.03023: calling self._execute() 30564 1726882816.03118: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.03128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.03143: variable 'omit' from source: magic vars 30564 1726882816.03491: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.03509: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.03634: variable 'profile_stat' from source: set_fact 30564 1726882816.03654: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882816.03661: when evaluation is False, skipping this task 30564 1726882816.03670: _execute() done 30564 1726882816.03679: dumping result to json 30564 1726882816.03687: done dumping result, returning 30564 1726882816.03695: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000405] 30564 1726882816.03705: sending task result for task 0e448fcc-3ce9-4216-acec-000000000405 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882816.03848: no more pending results, returning what we have 30564 1726882816.03853: results queue empty 30564 1726882816.03854: checking for any_errors_fatal 30564 1726882816.03860: done checking for any_errors_fatal 30564 1726882816.03861: checking for max_fail_percentage 30564 1726882816.03865: done checking for max_fail_percentage 30564 1726882816.03867: checking to see if all 
hosts have failed and the running result is not ok 30564 1726882816.03867: done checking to see if all hosts have failed 30564 1726882816.03868: getting the remaining hosts for this loop 30564 1726882816.03870: done getting the remaining hosts for this loop 30564 1726882816.03874: getting the next task for host managed_node2 30564 1726882816.03884: done getting next task for host managed_node2 30564 1726882816.03886: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30564 1726882816.03891: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882816.03896: getting variables 30564 1726882816.03897: in VariableManager get_vars() 30564 1726882816.03928: Calling all_inventory to load vars for managed_node2 30564 1726882816.03931: Calling groups_inventory to load vars for managed_node2 30564 1726882816.03935: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.03949: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.03952: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.03955: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.04983: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000405 30564 1726882816.04987: WORKER PROCESS EXITING 30564 1726882816.05735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.07377: done with get_vars() 30564 1726882816.07402: done getting variables 30564 1726882816.07463: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882816.07579: variable 'profile' from source: play vars 30564 1726882816.07583: variable 'interface' from source: play vars 30564 1726882816.07643: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:40:16 -0400 (0:00:00.054) 0:00:14.658 ****** 30564 1726882816.07679: entering _queue_task() for managed_node2/command 30564 1726882816.07994: worker is 1 (out of 1 available) 30564 1726882816.08007: exiting _queue_task() for managed_node2/command 30564 
1726882816.08020: done queuing things up, now waiting for results queue to drain 30564 1726882816.08021: waiting for pending results... 30564 1726882816.08296: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 30564 1726882816.08420: in run() - task 0e448fcc-3ce9-4216-acec-000000000406 30564 1726882816.08437: variable 'ansible_search_path' from source: unknown 30564 1726882816.08443: variable 'ansible_search_path' from source: unknown 30564 1726882816.08484: calling self._execute() 30564 1726882816.08581: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.08591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.08605: variable 'omit' from source: magic vars 30564 1726882816.08947: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.08967: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.09095: variable 'profile_stat' from source: set_fact 30564 1726882816.09112: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882816.09121: when evaluation is False, skipping this task 30564 1726882816.09128: _execute() done 30564 1726882816.09135: dumping result to json 30564 1726882816.09141: done dumping result, returning 30564 1726882816.09149: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000406] 30564 1726882816.09158: sending task result for task 0e448fcc-3ce9-4216-acec-000000000406 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882816.09305: no more pending results, returning what we have 30564 1726882816.09310: results queue empty 30564 1726882816.09311: checking for any_errors_fatal 30564 1726882816.09318: done checking for any_errors_fatal 30564 1726882816.09319: checking for 
max_fail_percentage 30564 1726882816.09321: done checking for max_fail_percentage 30564 1726882816.09322: checking to see if all hosts have failed and the running result is not ok 30564 1726882816.09323: done checking to see if all hosts have failed 30564 1726882816.09324: getting the remaining hosts for this loop 30564 1726882816.09325: done getting the remaining hosts for this loop 30564 1726882816.09329: getting the next task for host managed_node2 30564 1726882816.09338: done getting next task for host managed_node2 30564 1726882816.09341: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30564 1726882816.09345: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882816.09350: getting variables 30564 1726882816.09352: in VariableManager get_vars() 30564 1726882816.09384: Calling all_inventory to load vars for managed_node2 30564 1726882816.09387: Calling groups_inventory to load vars for managed_node2 30564 1726882816.09391: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.09406: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.09409: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.09412: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.11011: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000406 30564 1726882816.11015: WORKER PROCESS EXITING 30564 1726882816.11184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.12846: done with get_vars() 30564 1726882816.12869: done getting variables 30564 1726882816.12928: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882816.13034: variable 'profile' from source: play vars 30564 1726882816.13038: variable 'interface' from source: play vars 30564 1726882816.13100: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:40:16 -0400 (0:00:00.054) 0:00:14.712 ****** 30564 1726882816.13131: entering _queue_task() for managed_node2/set_fact 30564 1726882816.13405: worker is 1 (out of 1 available) 30564 1726882816.13416: exiting _queue_task() for managed_node2/set_fact 30564 
1726882816.13427: done queuing things up, now waiting for results queue to drain 30564 1726882816.13428: waiting for pending results... 30564 1726882816.13704: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30564 1726882816.13830: in run() - task 0e448fcc-3ce9-4216-acec-000000000407 30564 1726882816.13851: variable 'ansible_search_path' from source: unknown 30564 1726882816.13859: variable 'ansible_search_path' from source: unknown 30564 1726882816.13902: calling self._execute() 30564 1726882816.13999: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.14011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.14026: variable 'omit' from source: magic vars 30564 1726882816.14385: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.14405: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.14537: variable 'profile_stat' from source: set_fact 30564 1726882816.14554: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882816.14563: when evaluation is False, skipping this task 30564 1726882816.14574: _execute() done 30564 1726882816.14583: dumping result to json 30564 1726882816.14591: done dumping result, returning 30564 1726882816.14600: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000407] 30564 1726882816.14610: sending task result for task 0e448fcc-3ce9-4216-acec-000000000407 30564 1726882816.14718: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000407 30564 1726882816.14725: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882816.14785: no more pending results, returning what we have 30564 1726882816.14790: results queue empty 30564 
1726882816.14791: checking for any_errors_fatal 30564 1726882816.14797: done checking for any_errors_fatal 30564 1726882816.14798: checking for max_fail_percentage 30564 1726882816.14800: done checking for max_fail_percentage 30564 1726882816.14802: checking to see if all hosts have failed and the running result is not ok 30564 1726882816.14802: done checking to see if all hosts have failed 30564 1726882816.14803: getting the remaining hosts for this loop 30564 1726882816.14805: done getting the remaining hosts for this loop 30564 1726882816.14809: getting the next task for host managed_node2 30564 1726882816.14819: done getting next task for host managed_node2 30564 1726882816.14822: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 30564 1726882816.14826: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882816.14831: getting variables 30564 1726882816.14833: in VariableManager get_vars() 30564 1726882816.14865: Calling all_inventory to load vars for managed_node2 30564 1726882816.14868: Calling groups_inventory to load vars for managed_node2 30564 1726882816.14872: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.14885: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.14889: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.14892: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.16488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.18139: done with get_vars() 30564 1726882816.18167: done getting variables 30564 1726882816.18223: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882816.18330: variable 'profile' from source: play vars 30564 1726882816.18334: variable 'interface' from source: play vars 30564 1726882816.18395: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'statebr'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:40:16 -0400 (0:00:00.052) 0:00:14.765 ****** 30564 1726882816.18424: entering _queue_task() for managed_node2/assert 30564 1726882816.18696: worker is 1 (out of 1 available) 30564 1726882816.18709: exiting _queue_task() for managed_node2/assert 30564 1726882816.18721: done queuing things up, now waiting for results queue to drain 30564 1726882816.18722: waiting for pending results... 
30564 1726882816.18989: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' 30564 1726882816.19104: in run() - task 0e448fcc-3ce9-4216-acec-000000000384 30564 1726882816.19125: variable 'ansible_search_path' from source: unknown 30564 1726882816.19134: variable 'ansible_search_path' from source: unknown 30564 1726882816.19178: calling self._execute() 30564 1726882816.19268: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.19282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.19299: variable 'omit' from source: magic vars 30564 1726882816.19643: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.19661: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.19674: variable 'omit' from source: magic vars 30564 1726882816.19726: variable 'omit' from source: magic vars 30564 1726882816.19828: variable 'profile' from source: play vars 30564 1726882816.19837: variable 'interface' from source: play vars 30564 1726882816.19904: variable 'interface' from source: play vars 30564 1726882816.19929: variable 'omit' from source: magic vars 30564 1726882816.19973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882816.20009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882816.20036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882816.20057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.20076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.20107: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30564 1726882816.20114: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.20121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.20223: Set connection var ansible_timeout to 10 30564 1726882816.20233: Set connection var ansible_pipelining to False 30564 1726882816.20239: Set connection var ansible_shell_type to sh 30564 1726882816.20253: Set connection var ansible_shell_executable to /bin/sh 30564 1726882816.20266: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882816.20273: Set connection var ansible_connection to ssh 30564 1726882816.20302: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.20309: variable 'ansible_connection' from source: unknown 30564 1726882816.20315: variable 'ansible_module_compression' from source: unknown 30564 1726882816.20320: variable 'ansible_shell_type' from source: unknown 30564 1726882816.20326: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.20331: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.20337: variable 'ansible_pipelining' from source: unknown 30564 1726882816.20343: variable 'ansible_timeout' from source: unknown 30564 1726882816.20350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.20486: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882816.20501: variable 'omit' from source: magic vars 30564 1726882816.20509: starting attempt loop 30564 1726882816.20516: running the handler 30564 1726882816.20623: variable 'lsr_net_profile_exists' from source: set_fact 30564 1726882816.20633: Evaluated conditional 
(lsr_net_profile_exists): True 30564 1726882816.20642: handler run complete 30564 1726882816.20658: attempt loop complete, returning result 30564 1726882816.20666: _execute() done 30564 1726882816.20673: dumping result to json 30564 1726882816.20684: done dumping result, returning 30564 1726882816.20694: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [0e448fcc-3ce9-4216-acec-000000000384] 30564 1726882816.20703: sending task result for task 0e448fcc-3ce9-4216-acec-000000000384 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882816.20835: no more pending results, returning what we have 30564 1726882816.20839: results queue empty 30564 1726882816.20840: checking for any_errors_fatal 30564 1726882816.20848: done checking for any_errors_fatal 30564 1726882816.20848: checking for max_fail_percentage 30564 1726882816.20850: done checking for max_fail_percentage 30564 1726882816.20851: checking to see if all hosts have failed and the running result is not ok 30564 1726882816.20852: done checking to see if all hosts have failed 30564 1726882816.20853: getting the remaining hosts for this loop 30564 1726882816.20855: done getting the remaining hosts for this loop 30564 1726882816.20859: getting the next task for host managed_node2 30564 1726882816.20868: done getting next task for host managed_node2 30564 1726882816.20871: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 30564 1726882816.20875: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882816.20880: getting variables 30564 1726882816.20882: in VariableManager get_vars() 30564 1726882816.20912: Calling all_inventory to load vars for managed_node2 30564 1726882816.20914: Calling groups_inventory to load vars for managed_node2 30564 1726882816.20918: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.20928: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.20932: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.20935: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.21982: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000384 30564 1726882816.21986: WORKER PROCESS EXITING 30564 1726882816.22698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.24344: done with get_vars() 30564 1726882816.24367: done getting variables 30564 1726882816.24426: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882816.24537: variable 'profile' from source: play vars 30564 1726882816.24541: variable 'interface' from source: play vars 30564 1726882816.24602: variable 'interface' from source: play vars TASK [Assert that the 
ansible managed comment is present in 'statebr'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:40:16 -0400 (0:00:00.062) 0:00:14.827 ****** 30564 1726882816.24637: entering _queue_task() for managed_node2/assert 30564 1726882816.24915: worker is 1 (out of 1 available) 30564 1726882816.24927: exiting _queue_task() for managed_node2/assert 30564 1726882816.24939: done queuing things up, now waiting for results queue to drain 30564 1726882816.24941: waiting for pending results... 30564 1726882816.25219: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' 30564 1726882816.25329: in run() - task 0e448fcc-3ce9-4216-acec-000000000385 30564 1726882816.25348: variable 'ansible_search_path' from source: unknown 30564 1726882816.25356: variable 'ansible_search_path' from source: unknown 30564 1726882816.25397: calling self._execute() 30564 1726882816.25490: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.25500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.25513: variable 'omit' from source: magic vars 30564 1726882816.25849: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.25868: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.25880: variable 'omit' from source: magic vars 30564 1726882816.25931: variable 'omit' from source: magic vars 30564 1726882816.26032: variable 'profile' from source: play vars 30564 1726882816.26041: variable 'interface' from source: play vars 30564 1726882816.26101: variable 'interface' from source: play vars 30564 1726882816.26122: variable 'omit' from source: magic vars 30564 1726882816.26168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882816.26205: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882816.26229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882816.26257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.26278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.26313: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882816.26322: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.26330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.26434: Set connection var ansible_timeout to 10 30564 1726882816.26445: Set connection var ansible_pipelining to False 30564 1726882816.26452: Set connection var ansible_shell_type to sh 30564 1726882816.26468: Set connection var ansible_shell_executable to /bin/sh 30564 1726882816.26482: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882816.26490: Set connection var ansible_connection to ssh 30564 1726882816.26518: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.26526: variable 'ansible_connection' from source: unknown 30564 1726882816.26534: variable 'ansible_module_compression' from source: unknown 30564 1726882816.26541: variable 'ansible_shell_type' from source: unknown 30564 1726882816.26548: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.26555: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.26564: variable 'ansible_pipelining' from source: unknown 30564 1726882816.26576: variable 'ansible_timeout' from source: unknown 30564 1726882816.26584: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882816.26724: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882816.26742: variable 'omit' from source: magic vars 30564 1726882816.26753: starting attempt loop 30564 1726882816.26760: running the handler 30564 1726882816.26872: variable 'lsr_net_profile_ansible_managed' from source: set_fact 30564 1726882816.26883: Evaluated conditional (lsr_net_profile_ansible_managed): True 30564 1726882816.26894: handler run complete 30564 1726882816.26915: attempt loop complete, returning result 30564 1726882816.26922: _execute() done 30564 1726882816.26929: dumping result to json 30564 1726882816.26936: done dumping result, returning 30564 1726882816.26947: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0e448fcc-3ce9-4216-acec-000000000385] 30564 1726882816.26956: sending task result for task 0e448fcc-3ce9-4216-acec-000000000385 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882816.27096: no more pending results, returning what we have 30564 1726882816.27100: results queue empty 30564 1726882816.27101: checking for any_errors_fatal 30564 1726882816.27106: done checking for any_errors_fatal 30564 1726882816.27107: checking for max_fail_percentage 30564 1726882816.27109: done checking for max_fail_percentage 30564 1726882816.27110: checking to see if all hosts have failed and the running result is not ok 30564 1726882816.27111: done checking to see if all hosts have failed 30564 1726882816.27112: getting the remaining hosts for this loop 30564 1726882816.27114: done getting the remaining hosts for this loop 30564 1726882816.27118: getting the next task for host managed_node2 30564 
1726882816.27126: done getting next task for host managed_node2 30564 1726882816.27129: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 30564 1726882816.27133: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882816.27138: getting variables 30564 1726882816.27140: in VariableManager get_vars() 30564 1726882816.27173: Calling all_inventory to load vars for managed_node2 30564 1726882816.27176: Calling groups_inventory to load vars for managed_node2 30564 1726882816.27180: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.27191: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.27194: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.27198: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.28182: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000385 30564 1726882816.28185: WORKER PROCESS EXITING 30564 1726882816.28968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.30614: done with get_vars() 30564 1726882816.30635: done getting variables 30564 1726882816.30698: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882816.30807: variable 'profile' from source: play vars 30564 1726882816.30811: variable 'interface' from source: play vars 30564 1726882816.30874: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in statebr] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:40:16 -0400 (0:00:00.062) 0:00:14.890 ****** 30564 1726882816.30905: entering _queue_task() for managed_node2/assert 30564 1726882816.31182: worker is 1 (out of 1 available) 30564 1726882816.31195: exiting _queue_task() for managed_node2/assert 30564 
1726882816.31207: done queuing things up, now waiting for results queue to drain 30564 1726882816.31208: waiting for pending results... 30564 1726882816.31473: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr 30564 1726882816.31581: in run() - task 0e448fcc-3ce9-4216-acec-000000000386 30564 1726882816.31599: variable 'ansible_search_path' from source: unknown 30564 1726882816.31606: variable 'ansible_search_path' from source: unknown 30564 1726882816.31646: calling self._execute() 30564 1726882816.31740: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.31752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.31771: variable 'omit' from source: magic vars 30564 1726882816.32113: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.32131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.32140: variable 'omit' from source: magic vars 30564 1726882816.32190: variable 'omit' from source: magic vars 30564 1726882816.32290: variable 'profile' from source: play vars 30564 1726882816.32302: variable 'interface' from source: play vars 30564 1726882816.32369: variable 'interface' from source: play vars 30564 1726882816.32391: variable 'omit' from source: magic vars 30564 1726882816.32438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882816.32478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882816.32500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882816.32526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.32541: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.32575: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882816.32583: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.32590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.32695: Set connection var ansible_timeout to 10 30564 1726882816.32705: Set connection var ansible_pipelining to False 30564 1726882816.32712: Set connection var ansible_shell_type to sh 30564 1726882816.32720: Set connection var ansible_shell_executable to /bin/sh 30564 1726882816.32732: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882816.32744: Set connection var ansible_connection to ssh 30564 1726882816.32776: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.32783: variable 'ansible_connection' from source: unknown 30564 1726882816.32789: variable 'ansible_module_compression' from source: unknown 30564 1726882816.32795: variable 'ansible_shell_type' from source: unknown 30564 1726882816.32800: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.32806: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.32812: variable 'ansible_pipelining' from source: unknown 30564 1726882816.32818: variable 'ansible_timeout' from source: unknown 30564 1726882816.32825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.32960: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882816.32979: variable 'omit' from source: magic vars 30564 1726882816.32988: starting 
attempt loop 30564 1726882816.32994: running the handler 30564 1726882816.33103: variable 'lsr_net_profile_fingerprint' from source: set_fact 30564 1726882816.33114: Evaluated conditional (lsr_net_profile_fingerprint): True 30564 1726882816.33124: handler run complete 30564 1726882816.33143: attempt loop complete, returning result 30564 1726882816.33150: _execute() done 30564 1726882816.33156: dumping result to json 30564 1726882816.33165: done dumping result, returning 30564 1726882816.33177: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [0e448fcc-3ce9-4216-acec-000000000386] 30564 1726882816.33187: sending task result for task 0e448fcc-3ce9-4216-acec-000000000386 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882816.33328: no more pending results, returning what we have 30564 1726882816.33332: results queue empty 30564 1726882816.33333: checking for any_errors_fatal 30564 1726882816.33341: done checking for any_errors_fatal 30564 1726882816.33342: checking for max_fail_percentage 30564 1726882816.33345: done checking for max_fail_percentage 30564 1726882816.33346: checking to see if all hosts have failed and the running result is not ok 30564 1726882816.33347: done checking to see if all hosts have failed 30564 1726882816.33348: getting the remaining hosts for this loop 30564 1726882816.33350: done getting the remaining hosts for this loop 30564 1726882816.33354: getting the next task for host managed_node2 30564 1726882816.33366: done getting next task for host managed_node2 30564 1726882816.33369: ^ task is: TASK: Conditional asserts 30564 1726882816.33372: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882816.33377: getting variables 30564 1726882816.33379: in VariableManager get_vars() 30564 1726882816.33406: Calling all_inventory to load vars for managed_node2 30564 1726882816.33409: Calling groups_inventory to load vars for managed_node2 30564 1726882816.33412: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.33422: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.33425: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.33427: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.34683: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000386 30564 1726882816.34686: WORKER PROCESS EXITING 30564 1726882816.35126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.36783: done with get_vars() 30564 1726882816.36806: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:40:16 -0400 (0:00:00.059) 0:00:14.950 ****** 30564 1726882816.36903: entering _queue_task() for managed_node2/include_tasks 30564 1726882816.37184: worker is 1 (out of 1 available) 30564 1726882816.37197: exiting _queue_task() for managed_node2/include_tasks 30564 1726882816.37210: done queuing things up, now waiting for results queue to drain 30564 1726882816.37211: waiting for pending results... 
30564 1726882816.37481: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30564 1726882816.37586: in run() - task 0e448fcc-3ce9-4216-acec-000000000097 30564 1726882816.37606: variable 'ansible_search_path' from source: unknown 30564 1726882816.37612: variable 'ansible_search_path' from source: unknown 30564 1726882816.37890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882816.40391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882816.40460: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882816.40510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882816.40548: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882816.40582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882816.40666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882816.40705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882816.40734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882816.40780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30564 1726882816.40799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882816.40904: variable 'lsr_assert_when' from source: include params 30564 1726882816.41025: variable 'network_provider' from source: set_fact 30564 1726882816.41101: variable 'omit' from source: magic vars 30564 1726882816.41217: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.41231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.41250: variable 'omit' from source: magic vars 30564 1726882816.41441: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.41460: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.41577: variable 'item' from source: unknown 30564 1726882816.41588: Evaluated conditional (item['condition']): True 30564 1726882816.41671: variable 'item' from source: unknown 30564 1726882816.41706: variable 'item' from source: unknown 30564 1726882816.41775: variable 'item' from source: unknown 30564 1726882816.41941: dumping result to json 30564 1726882816.41950: done dumping result, returning 30564 1726882816.41959: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0e448fcc-3ce9-4216-acec-000000000097] 30564 1726882816.41970: sending task result for task 0e448fcc-3ce9-4216-acec-000000000097 30564 1726882816.42059: no more pending results, returning what we have 30564 1726882816.42066: in VariableManager get_vars() 30564 1726882816.42101: Calling all_inventory to load vars for managed_node2 30564 1726882816.42103: Calling groups_inventory to load vars for managed_node2 30564 1726882816.42107: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.42118: Calling all_plugins_play to load vars for managed_node2 30564 
1726882816.42121: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.42124: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.43181: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000097 30564 1726882816.43184: WORKER PROCESS EXITING 30564 1726882816.43878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.45522: done with get_vars() 30564 1726882816.45541: variable 'ansible_search_path' from source: unknown 30564 1726882816.45543: variable 'ansible_search_path' from source: unknown 30564 1726882816.45586: we have included files to process 30564 1726882816.45587: generating all_blocks data 30564 1726882816.45589: done generating all_blocks data 30564 1726882816.45594: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882816.45596: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882816.45598: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882816.45761: in VariableManager get_vars() 30564 1726882816.45785: done with get_vars() 30564 1726882816.45901: done processing included file 30564 1726882816.45903: iterating over new_blocks loaded from include file 30564 1726882816.45905: in VariableManager get_vars() 30564 1726882816.45918: done with get_vars() 30564 1726882816.45920: filtering new block on tags 30564 1726882816.45954: done filtering new block on tags 30564 1726882816.45957: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item={'what': 
'tasks/assert_device_present.yml', 'condition': True}) 30564 1726882816.45962: extending task lists for all hosts with included blocks 30564 1726882816.47227: done extending task lists 30564 1726882816.47228: done processing included files 30564 1726882816.47229: results queue empty 30564 1726882816.47230: checking for any_errors_fatal 30564 1726882816.47233: done checking for any_errors_fatal 30564 1726882816.47233: checking for max_fail_percentage 30564 1726882816.47234: done checking for max_fail_percentage 30564 1726882816.47235: checking to see if all hosts have failed and the running result is not ok 30564 1726882816.47236: done checking to see if all hosts have failed 30564 1726882816.47237: getting the remaining hosts for this loop 30564 1726882816.47238: done getting the remaining hosts for this loop 30564 1726882816.47241: getting the next task for host managed_node2 30564 1726882816.47245: done getting next task for host managed_node2 30564 1726882816.47247: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882816.47249: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882816.47257: getting variables 30564 1726882816.47258: in VariableManager get_vars() 30564 1726882816.47269: Calling all_inventory to load vars for managed_node2 30564 1726882816.47271: Calling groups_inventory to load vars for managed_node2 30564 1726882816.47273: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.47278: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.47281: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.47283: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.52402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.54102: done with get_vars() 30564 1726882816.54125: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:40:16 -0400 (0:00:00.172) 0:00:15.123 ****** 30564 1726882816.54201: entering _queue_task() for managed_node2/include_tasks 30564 1726882816.54524: worker is 1 (out of 1 available) 30564 1726882816.54537: exiting _queue_task() for managed_node2/include_tasks 30564 1726882816.54549: done queuing things up, now waiting for results queue to drain 30564 1726882816.54552: waiting for pending results... 
30564 1726882816.54828: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882816.54940: in run() - task 0e448fcc-3ce9-4216-acec-000000000452 30564 1726882816.54967: variable 'ansible_search_path' from source: unknown 30564 1726882816.54978: variable 'ansible_search_path' from source: unknown 30564 1726882816.55025: calling self._execute() 30564 1726882816.55138: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.55151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.55169: variable 'omit' from source: magic vars 30564 1726882816.55579: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.55600: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.55610: _execute() done 30564 1726882816.55618: dumping result to json 30564 1726882816.55625: done dumping result, returning 30564 1726882816.55634: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-000000000452] 30564 1726882816.55650: sending task result for task 0e448fcc-3ce9-4216-acec-000000000452 30564 1726882816.55787: no more pending results, returning what we have 30564 1726882816.55792: in VariableManager get_vars() 30564 1726882816.55831: Calling all_inventory to load vars for managed_node2 30564 1726882816.55834: Calling groups_inventory to load vars for managed_node2 30564 1726882816.55838: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.55853: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.55856: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.55860: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.57162: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000452 30564 1726882816.57168: WORKER PROCESS EXITING 30564 
1726882816.57585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.59251: done with get_vars() 30564 1726882816.59272: variable 'ansible_search_path' from source: unknown 30564 1726882816.59274: variable 'ansible_search_path' from source: unknown 30564 1726882816.59411: variable 'item' from source: include params 30564 1726882816.59447: we have included files to process 30564 1726882816.59449: generating all_blocks data 30564 1726882816.59450: done generating all_blocks data 30564 1726882816.59452: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882816.59453: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882816.59455: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882816.59630: done processing included file 30564 1726882816.59633: iterating over new_blocks loaded from include file 30564 1726882816.59634: in VariableManager get_vars() 30564 1726882816.59647: done with get_vars() 30564 1726882816.59649: filtering new block on tags 30564 1726882816.59673: done filtering new block on tags 30564 1726882816.59675: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882816.59679: extending task lists for all hosts with included blocks 30564 1726882816.59840: done extending task lists 30564 1726882816.59842: done processing included files 30564 1726882816.59843: results queue empty 30564 1726882816.59843: checking for any_errors_fatal 30564 1726882816.59848: done checking for any_errors_fatal 30564 1726882816.59849: checking for 
max_fail_percentage 30564 1726882816.59850: done checking for max_fail_percentage 30564 1726882816.59851: checking to see if all hosts have failed and the running result is not ok 30564 1726882816.59852: done checking to see if all hosts have failed 30564 1726882816.59852: getting the remaining hosts for this loop 30564 1726882816.59854: done getting the remaining hosts for this loop 30564 1726882816.59856: getting the next task for host managed_node2 30564 1726882816.59861: done getting next task for host managed_node2 30564 1726882816.59865: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882816.59870: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882816.59872: getting variables 30564 1726882816.59873: in VariableManager get_vars() 30564 1726882816.59883: Calling all_inventory to load vars for managed_node2 30564 1726882816.59885: Calling groups_inventory to load vars for managed_node2 30564 1726882816.59888: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882816.59893: Calling all_plugins_play to load vars for managed_node2 30564 1726882816.59895: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882816.59898: Calling groups_plugins_play to load vars for managed_node2 30564 1726882816.61150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882816.62771: done with get_vars() 30564 1726882816.62791: done getting variables 30564 1726882816.62916: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:40:16 -0400 (0:00:00.087) 0:00:15.210 ****** 30564 1726882816.62945: entering _queue_task() for managed_node2/stat 30564 1726882816.63266: worker is 1 (out of 1 available) 30564 1726882816.63279: exiting _queue_task() for managed_node2/stat 30564 1726882816.63291: done queuing things up, now waiting for results queue to drain 30564 1726882816.63292: waiting for pending results... 
30564 1726882816.63571: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882816.63686: in run() - task 0e448fcc-3ce9-4216-acec-0000000004e8 30564 1726882816.63708: variable 'ansible_search_path' from source: unknown 30564 1726882816.63716: variable 'ansible_search_path' from source: unknown 30564 1726882816.63762: calling self._execute() 30564 1726882816.63872: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.63888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.63906: variable 'omit' from source: magic vars 30564 1726882816.64281: variable 'ansible_distribution_major_version' from source: facts 30564 1726882816.64298: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882816.64308: variable 'omit' from source: magic vars 30564 1726882816.64366: variable 'omit' from source: magic vars 30564 1726882816.64460: variable 'interface' from source: play vars 30564 1726882816.64489: variable 'omit' from source: magic vars 30564 1726882816.64536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882816.64578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882816.64607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882816.64632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.64649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882816.64685: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882816.64695: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.64707: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.64808: Set connection var ansible_timeout to 10 30564 1726882816.64822: Set connection var ansible_pipelining to False 30564 1726882816.64829: Set connection var ansible_shell_type to sh 30564 1726882816.64840: Set connection var ansible_shell_executable to /bin/sh 30564 1726882816.64853: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882816.64860: Set connection var ansible_connection to ssh 30564 1726882816.64890: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.64900: variable 'ansible_connection' from source: unknown 30564 1726882816.64907: variable 'ansible_module_compression' from source: unknown 30564 1726882816.64915: variable 'ansible_shell_type' from source: unknown 30564 1726882816.64926: variable 'ansible_shell_executable' from source: unknown 30564 1726882816.64933: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882816.64941: variable 'ansible_pipelining' from source: unknown 30564 1726882816.64949: variable 'ansible_timeout' from source: unknown 30564 1726882816.64956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882816.65161: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882816.65179: variable 'omit' from source: magic vars 30564 1726882816.65189: starting attempt loop 30564 1726882816.65196: running the handler 30564 1726882816.65213: _low_level_execute_command(): starting 30564 1726882816.65226: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882816.65982: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882816.65999: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882816.66017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.66036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.66082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.66095: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882816.66114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.66133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882816.66147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882816.66158: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882816.66174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.66189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.66205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.66220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.66235: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882816.66250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.66332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882816.66358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882816.66379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882816.66518: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882816.68180: stdout chunk (state=3): >>>/root <<< 30564 1726882816.68279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882816.68373: stderr chunk (state=3): >>><<< 30564 1726882816.68389: stdout chunk (state=3): >>><<< 30564 1726882816.68471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882816.68475: _low_level_execute_command(): starting 30564 1726882816.68477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169 `" && echo ansible-tmp-1726882816.684289-31254-272204596085169="` echo /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169 `" ) && sleep 0' 30564 
1726882816.69127: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882816.69141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.69155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.69175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.69225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.69237: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882816.69251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.69270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882816.69282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882816.69292: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882816.69303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.69323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.69339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.69350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.69360: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882816.69376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.69453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882816.69477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882816.69493: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882816.69625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882816.71527: stdout chunk (state=3): >>>ansible-tmp-1726882816.684289-31254-272204596085169=/root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169 <<< 30564 1726882816.71827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882816.71907: stderr chunk (state=3): >>><<< 30564 1726882816.71910: stdout chunk (state=3): >>><<< 30564 1726882816.71977: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882816.684289-31254-272204596085169=/root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882816.72269: variable 'ansible_module_compression' from source: unknown 
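The `_low_level_execute_command()` calls traced above create a private, uniquely named remote temporary directory before the module payload is uploaded, and echo the directory name back so the controller can parse it from stdout. A minimal sketch of that pattern (the `tmp` variable name and timestamp format here are illustrative, not Ansible's exact internals):

```shell
# Build a unique per-task temp dir name from a timestamp and the shell PID,
# create it with umask 77 (so it gets mode 700), and echo the path back
# for the caller to capture -- mirroring the mkdir/echo command in the log.
tmp="$HOME/.ansible/tmp/ansible-tmp-$(date +%s.%N)-$$"
umask 77 && mkdir -p "$tmp" && echo "$tmp"
```

The controller then uses the echoed path as the destination for the `AnsiballZ_stat.py` upload seen in the subsequent sftp chunk.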
30564 1726882816.72272: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882816.72275: variable 'ansible_facts' from source: unknown 30564 1726882816.72278: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169/AnsiballZ_stat.py 30564 1726882816.72343: Sending initial data 30564 1726882816.72346: Sent initial data (152 bytes) 30564 1726882816.73356: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882816.73377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.73392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.73410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.73453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.73467: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882816.73488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.73506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882816.73516: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882816.73527: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882816.73537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.73549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.73565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.73577: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.73587: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882816.73605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.73684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882816.73712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882816.73728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882816.73863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882816.75700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882816.75791: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882816.75893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpd0ox032p /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169/AnsiballZ_stat.py <<< 30564 1726882816.75991: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882816.77339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882816.77609: stderr chunk (state=3): >>><<< 30564 1726882816.77612: stdout chunk (state=3): 
>>><<< 30564 1726882816.77614: done transferring module to remote 30564 1726882816.77620: _low_level_execute_command(): starting 30564 1726882816.77623: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169/ /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169/AnsiballZ_stat.py && sleep 0' 30564 1726882816.78233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882816.78248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.78265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.78290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.78331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.78345: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882816.78359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.78381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882816.78395: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882816.78412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882816.78425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.78439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.78455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.78472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.78485: 
stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882816.78499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.78583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882816.78605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882816.78629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882816.78762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882816.80644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882816.80647: stdout chunk (state=3): >>><<< 30564 1726882816.80650: stderr chunk (state=3): >>><<< 30564 1726882816.80750: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30564 1726882816.80754: _low_level_execute_command(): starting 30564 1726882816.80757: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169/AnsiballZ_stat.py && sleep 0' 30564 1726882816.81541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882816.81550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.81560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.81579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.81619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.81626: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882816.81636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.81649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882816.81657: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882816.81665: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882816.81678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.81688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.81702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.81712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882816.81719: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882816.81728: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.81801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882816.81813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882816.81833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882816.81975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882816.95434: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32032, "dev": 21, "nlink": 1, "atime": 1726882813.9452877, "mtime": 1726882813.9452877, "ctime": 1726882813.9452877, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882816.96471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882816.96536: stderr chunk (state=3): >>><<< 30564 1726882816.96539: stdout chunk (state=3): >>><<< 30564 1726882816.96563: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32032, "dev": 21, "nlink": 1, "atime": 1726882813.9452877, "mtime": 1726882813.9452877, "ctime": 1726882813.9452877, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882816.96622: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882816.96631: _low_level_execute_command(): starting 30564 1726882816.96636: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882816.684289-31254-272204596085169/ > /dev/null 2>&1 && sleep 0' 30564 1726882816.98513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882816.98519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882816.98560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882816.98567: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882816.98576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882816.98581: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882816.98597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882816.98601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882816.98679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882816.98685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882816.98692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882816.98821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882817.00701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882817.00705: stderr chunk (state=3): >>><<< 30564 1726882817.00710: stdout chunk (state=3): >>><<< 30564 1726882817.00730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882817.00735: handler run complete 30564 1726882817.00792: attempt loop complete, returning result 30564 1726882817.00796: _execute() done 30564 1726882817.00798: dumping result to json 30564 1726882817.00803: done dumping result, returning 30564 1726882817.00812: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-0000000004e8] 30564 1726882817.00819: sending task result for task 0e448fcc-3ce9-4216-acec-0000000004e8 30564 1726882817.00935: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000004e8 30564 1726882817.00937: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882813.9452877, "block_size": 4096, "blocks": 0, "ctime": 1726882813.9452877, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32032, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726882813.9452877, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, 
"xgrp": true, "xoth": true, "xusr": true } } 30564 1726882817.01165: no more pending results, returning what we have 30564 1726882817.01172: results queue empty 30564 1726882817.01173: checking for any_errors_fatal 30564 1726882817.01175: done checking for any_errors_fatal 30564 1726882817.01176: checking for max_fail_percentage 30564 1726882817.01178: done checking for max_fail_percentage 30564 1726882817.01178: checking to see if all hosts have failed and the running result is not ok 30564 1726882817.01179: done checking to see if all hosts have failed 30564 1726882817.01180: getting the remaining hosts for this loop 30564 1726882817.01181: done getting the remaining hosts for this loop 30564 1726882817.01185: getting the next task for host managed_node2 30564 1726882817.01195: done getting next task for host managed_node2 30564 1726882817.01198: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30564 1726882817.01202: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882817.01207: getting variables 30564 1726882817.01209: in VariableManager get_vars() 30564 1726882817.01241: Calling all_inventory to load vars for managed_node2 30564 1726882817.01244: Calling groups_inventory to load vars for managed_node2 30564 1726882817.01248: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.01266: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.01273: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.01277: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.03835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882817.05853: done with get_vars() 30564 1726882817.05882: done getting variables 30564 1726882817.05947: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882817.06078: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:40:17 -0400 (0:00:00.431) 0:00:15.642 ****** 30564 1726882817.06110: entering _queue_task() for managed_node2/assert 30564 1726882817.06449: worker is 1 (out of 1 available) 30564 1726882817.06461: exiting _queue_task() for managed_node2/assert 30564 1726882817.06481: done queuing things up, now waiting for results queue to drain 30564 1726882817.06483: waiting for pending results... 
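[Editor's note] The trace above (a `stat` on `/sys/class/net/statebr` followed by a queued assert on `interface_stat.stat.exists`) follows the usual stat-then-assert pattern. A sketch of what `tasks/assert_device_present.yml` likely contains, reconstructed from the trace alone — the file itself is not part of this excerpt, and the trace shows `interface_stat` sourced from `set_fact`, so using `register` here is an assumption:

```yaml
# Hypothetical reconstruction -- not the verbatim file contents.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # the trace shows this var coming from set_fact

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
```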
30564 1726882817.06775: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 30564 1726882817.06898: in run() - task 0e448fcc-3ce9-4216-acec-000000000453 30564 1726882817.06927: variable 'ansible_search_path' from source: unknown 30564 1726882817.06935: variable 'ansible_search_path' from source: unknown 30564 1726882817.06980: calling self._execute() 30564 1726882817.07180: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.07192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.07205: variable 'omit' from source: magic vars 30564 1726882817.07773: variable 'ansible_distribution_major_version' from source: facts 30564 1726882817.07798: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882817.07816: variable 'omit' from source: magic vars 30564 1726882817.07862: variable 'omit' from source: magic vars 30564 1726882817.07980: variable 'interface' from source: play vars 30564 1726882817.08007: variable 'omit' from source: magic vars 30564 1726882817.08059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882817.08101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882817.08133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882817.08161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.08184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.08222: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882817.08231: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.08239: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.08371: Set connection var ansible_timeout to 10 30564 1726882817.08383: Set connection var ansible_pipelining to False 30564 1726882817.08389: Set connection var ansible_shell_type to sh 30564 1726882817.08398: Set connection var ansible_shell_executable to /bin/sh 30564 1726882817.08408: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882817.08414: Set connection var ansible_connection to ssh 30564 1726882817.08447: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.08454: variable 'ansible_connection' from source: unknown 30564 1726882817.08471: variable 'ansible_module_compression' from source: unknown 30564 1726882817.08478: variable 'ansible_shell_type' from source: unknown 30564 1726882817.08485: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.08490: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.08497: variable 'ansible_pipelining' from source: unknown 30564 1726882817.08503: variable 'ansible_timeout' from source: unknown 30564 1726882817.08510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.08669: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882817.08694: variable 'omit' from source: magic vars 30564 1726882817.08705: starting attempt loop 30564 1726882817.08712: running the handler 30564 1726882817.08994: variable 'interface_stat' from source: set_fact 30564 1726882817.09026: Evaluated conditional (interface_stat.stat.exists): True 30564 1726882817.09038: handler run complete 30564 1726882817.09083: attempt loop complete, returning result 30564 
1726882817.09091: _execute() done 30564 1726882817.09098: dumping result to json 30564 1726882817.09107: done dumping result, returning 30564 1726882817.09122: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [0e448fcc-3ce9-4216-acec-000000000453] 30564 1726882817.09138: sending task result for task 0e448fcc-3ce9-4216-acec-000000000453 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882817.09348: no more pending results, returning what we have 30564 1726882817.09352: results queue empty 30564 1726882817.09353: checking for any_errors_fatal 30564 1726882817.09371: done checking for any_errors_fatal 30564 1726882817.09372: checking for max_fail_percentage 30564 1726882817.09374: done checking for max_fail_percentage 30564 1726882817.09375: checking to see if all hosts have failed and the running result is not ok 30564 1726882817.09376: done checking to see if all hosts have failed 30564 1726882817.09377: getting the remaining hosts for this loop 30564 1726882817.09379: done getting the remaining hosts for this loop 30564 1726882817.09383: getting the next task for host managed_node2 30564 1726882817.09395: done getting next task for host managed_node2 30564 1726882817.09398: ^ task is: TASK: Success in test '{{ lsr_description }}' 30564 1726882817.09402: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882817.09407: getting variables 30564 1726882817.09409: in VariableManager get_vars() 30564 1726882817.09454: Calling all_inventory to load vars for managed_node2 30564 1726882817.09457: Calling groups_inventory to load vars for managed_node2 30564 1726882817.09461: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.09477: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.09481: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.09485: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.10536: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000453 30564 1726882817.10540: WORKER PROCESS EXITING 30564 1726882817.11454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882817.13878: done with get_vars() 30564 1726882817.13915: done getting variables 30564 1726882817.13975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882817.14103: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile'] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:40:17 -0400 (0:00:00.080) 0:00:15.722 ****** 30564 1726882817.14141: entering _queue_task() for managed_node2/debug 30564 1726882817.14728: worker is 1 (out of 1 available) 30564 1726882817.14740: exiting _queue_task() for managed_node2/debug 30564 1726882817.14752: done queuing things up, now waiting for results queue to drain 30564 1726882817.14754: waiting for pending results... 
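[Editor's note] The "Success in test" task queued above is a plain `debug`; given the result printed later in the trace (`+++++ Success in test 'I can create a profile' +++++`), it is presumably something like this — a sketch, not the verbatim `run_test.yml`:

```yaml
# Hypothetical sketch of the success-marker task at run_test.yml:47.
- name: Success in test '{{ lsr_description }}'
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
```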
30564 1726882817.15102: running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile' 30564 1726882817.15275: in run() - task 0e448fcc-3ce9-4216-acec-000000000098 30564 1726882817.15290: variable 'ansible_search_path' from source: unknown 30564 1726882817.15297: variable 'ansible_search_path' from source: unknown 30564 1726882817.15338: calling self._execute() 30564 1726882817.15454: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.15460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.15472: variable 'omit' from source: magic vars 30564 1726882817.15765: variable 'ansible_distribution_major_version' from source: facts 30564 1726882817.15780: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882817.15786: variable 'omit' from source: magic vars 30564 1726882817.15813: variable 'omit' from source: magic vars 30564 1726882817.15884: variable 'lsr_description' from source: include params 30564 1726882817.15900: variable 'omit' from source: magic vars 30564 1726882817.15931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882817.15958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882817.15978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882817.15995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.16005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.16027: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882817.16030: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.16033: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.16105: Set connection var ansible_timeout to 10 30564 1726882817.16109: Set connection var ansible_pipelining to False 30564 1726882817.16111: Set connection var ansible_shell_type to sh 30564 1726882817.16116: Set connection var ansible_shell_executable to /bin/sh 30564 1726882817.16123: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882817.16125: Set connection var ansible_connection to ssh 30564 1726882817.16143: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.16147: variable 'ansible_connection' from source: unknown 30564 1726882817.16149: variable 'ansible_module_compression' from source: unknown 30564 1726882817.16152: variable 'ansible_shell_type' from source: unknown 30564 1726882817.16154: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.16156: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.16158: variable 'ansible_pipelining' from source: unknown 30564 1726882817.16160: variable 'ansible_timeout' from source: unknown 30564 1726882817.16165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.16268: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882817.16281: variable 'omit' from source: magic vars 30564 1726882817.16284: starting attempt loop 30564 1726882817.16287: running the handler 30564 1726882817.16324: handler run complete 30564 1726882817.16335: attempt loop complete, returning result 30564 1726882817.16338: _execute() done 30564 1726882817.16340: dumping result to json 30564 1726882817.16343: done dumping result, returning 30564 
1726882817.16348: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile' [0e448fcc-3ce9-4216-acec-000000000098] 30564 1726882817.16353: sending task result for task 0e448fcc-3ce9-4216-acec-000000000098 30564 1726882817.16442: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000098 30564 1726882817.16444: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can create a profile' +++++ 30564 1726882817.16489: no more pending results, returning what we have 30564 1726882817.16497: results queue empty 30564 1726882817.16498: checking for any_errors_fatal 30564 1726882817.16505: done checking for any_errors_fatal 30564 1726882817.16505: checking for max_fail_percentage 30564 1726882817.16507: done checking for max_fail_percentage 30564 1726882817.16508: checking to see if all hosts have failed and the running result is not ok 30564 1726882817.16509: done checking to see if all hosts have failed 30564 1726882817.16510: getting the remaining hosts for this loop 30564 1726882817.16511: done getting the remaining hosts for this loop 30564 1726882817.16517: getting the next task for host managed_node2 30564 1726882817.16527: done getting next task for host managed_node2 30564 1726882817.16530: ^ task is: TASK: Cleanup 30564 1726882817.16533: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882817.16537: getting variables 30564 1726882817.16539: in VariableManager get_vars() 30564 1726882817.16596: Calling all_inventory to load vars for managed_node2 30564 1726882817.16600: Calling groups_inventory to load vars for managed_node2 30564 1726882817.16603: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.16615: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.16618: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.16620: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.18787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882817.20586: done with get_vars() 30564 1726882817.20617: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:40:17 -0400 (0:00:00.065) 0:00:15.788 ****** 30564 1726882817.20733: entering _queue_task() for managed_node2/include_tasks 30564 1726882817.21073: worker is 1 (out of 1 available) 30564 1726882817.21087: exiting _queue_task() for managed_node2/include_tasks 30564 1726882817.21098: done queuing things up, now waiting for results queue to drain 30564 1726882817.21100: waiting for pending results... 
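[Editor's note] The `Cleanup` task above is an `include_tasks` driven by the `lsr_cleanup` include param; the trace shows the single loop item `tasks/cleanup_profile+device.yml`. Roughly, assuming the loop shape (the actual `run_test.yml:66` is not shown here):

```yaml
# Hypothetical sketch of the cleanup dispatcher.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"   # here: [tasks/cleanup_profile+device.yml]
```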
30564 1726882817.21301: running TaskExecutor() for managed_node2/TASK: Cleanup 30564 1726882817.21398: in run() - task 0e448fcc-3ce9-4216-acec-00000000009c 30564 1726882817.21415: variable 'ansible_search_path' from source: unknown 30564 1726882817.21422: variable 'ansible_search_path' from source: unknown 30564 1726882817.21504: variable 'lsr_cleanup' from source: include params 30564 1726882817.21749: variable 'lsr_cleanup' from source: include params 30564 1726882817.21844: variable 'omit' from source: magic vars 30564 1726882817.22012: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.22027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.22052: variable 'omit' from source: magic vars 30564 1726882817.22314: variable 'ansible_distribution_major_version' from source: facts 30564 1726882817.22335: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882817.22345: variable 'item' from source: unknown 30564 1726882817.22416: variable 'item' from source: unknown 30564 1726882817.22458: variable 'item' from source: unknown 30564 1726882817.22531: variable 'item' from source: unknown 30564 1726882817.22695: dumping result to json 30564 1726882817.22703: done dumping result, returning 30564 1726882817.22711: done running TaskExecutor() for managed_node2/TASK: Cleanup [0e448fcc-3ce9-4216-acec-00000000009c] 30564 1726882817.22721: sending task result for task 0e448fcc-3ce9-4216-acec-00000000009c 30564 1726882817.22841: no more pending results, returning what we have 30564 1726882817.22846: in VariableManager get_vars() 30564 1726882817.22902: Calling all_inventory to load vars for managed_node2 30564 1726882817.22905: Calling groups_inventory to load vars for managed_node2 30564 1726882817.22909: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.22923: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.22927: 
Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.22930: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.24289: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000009c 30564 1726882817.24293: WORKER PROCESS EXITING 30564 1726882817.24547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882817.25865: done with get_vars() 30564 1726882817.25886: variable 'ansible_search_path' from source: unknown 30564 1726882817.25887: variable 'ansible_search_path' from source: unknown 30564 1726882817.25929: we have included files to process 30564 1726882817.25931: generating all_blocks data 30564 1726882817.25932: done generating all_blocks data 30564 1726882817.25937: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882817.25938: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882817.25940: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882817.26188: done processing included file 30564 1726882817.26190: iterating over new_blocks loaded from include file 30564 1726882817.26191: in VariableManager get_vars() 30564 1726882817.26208: done with get_vars() 30564 1726882817.26210: filtering new block on tags 30564 1726882817.26238: done filtering new block on tags 30564 1726882817.26240: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30564 1726882817.26245: extending task lists for all hosts with included blocks 30564 
1726882817.27684: done extending task lists 30564 1726882817.27685: done processing included files 30564 1726882817.27686: results queue empty 30564 1726882817.27688: checking for any_errors_fatal 30564 1726882817.27690: done checking for any_errors_fatal 30564 1726882817.27691: checking for max_fail_percentage 30564 1726882817.27691: done checking for max_fail_percentage 30564 1726882817.27692: checking to see if all hosts have failed and the running result is not ok 30564 1726882817.27692: done checking to see if all hosts have failed 30564 1726882817.27693: getting the remaining hosts for this loop 30564 1726882817.27694: done getting the remaining hosts for this loop 30564 1726882817.27696: getting the next task for host managed_node2 30564 1726882817.27699: done getting next task for host managed_node2 30564 1726882817.27701: ^ task is: TASK: Cleanup profile and device 30564 1726882817.27702: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882817.27704: getting variables 30564 1726882817.27705: in VariableManager get_vars() 30564 1726882817.27712: Calling all_inventory to load vars for managed_node2 30564 1726882817.27713: Calling groups_inventory to load vars for managed_node2 30564 1726882817.27715: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.27719: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.27720: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.27722: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.28828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882817.29992: done with get_vars() 30564 1726882817.30006: done getting variables 30564 1726882817.30036: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:40:17 -0400 (0:00:00.093) 0:00:15.881 ****** 30564 1726882817.30058: entering _queue_task() for managed_node2/shell 30564 1726882817.30281: worker is 1 (out of 1 available) 30564 1726882817.30294: exiting _queue_task() for managed_node2/shell 30564 1726882817.30306: done queuing things up, now waiting for results queue to drain 30564 1726882817.30307: waiting for pending results... 
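[Editor's note] The `shell` task at `cleanup_profile+device.yml:3` runs with the `interface` play var, but the command itself never appears in this excerpt. A plausible shape, purely illustrative — both commands below are assumptions, not taken from the log:

```yaml
# Purely illustrative -- the actual command is not visible in this trace.
- name: Cleanup profile and device
  shell: |
    nmcli con delete {{ interface }} || true
    ip link delete {{ interface }} || true
```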
30564 1726882817.30492: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30564 1726882817.30561: in run() - task 0e448fcc-3ce9-4216-acec-00000000050b 30564 1726882817.30576: variable 'ansible_search_path' from source: unknown 30564 1726882817.30580: variable 'ansible_search_path' from source: unknown 30564 1726882817.30609: calling self._execute() 30564 1726882817.30684: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.30688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.30697: variable 'omit' from source: magic vars 30564 1726882817.30955: variable 'ansible_distribution_major_version' from source: facts 30564 1726882817.30969: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882817.30975: variable 'omit' from source: magic vars 30564 1726882817.31171: variable 'omit' from source: magic vars 30564 1726882817.31218: variable 'interface' from source: play vars 30564 1726882817.31221: variable 'omit' from source: magic vars 30564 1726882817.31236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882817.31275: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882817.31297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882817.31308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.31319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.31348: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882817.31351: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.31354: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.31454: Set connection var ansible_timeout to 10 30564 1726882817.31457: Set connection var ansible_pipelining to False 30564 1726882817.31460: Set connection var ansible_shell_type to sh 30564 1726882817.31471: Set connection var ansible_shell_executable to /bin/sh 30564 1726882817.31477: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882817.31479: Set connection var ansible_connection to ssh 30564 1726882817.31506: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.31510: variable 'ansible_connection' from source: unknown 30564 1726882817.31512: variable 'ansible_module_compression' from source: unknown 30564 1726882817.31514: variable 'ansible_shell_type' from source: unknown 30564 1726882817.31517: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.31519: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.31521: variable 'ansible_pipelining' from source: unknown 30564 1726882817.31523: variable 'ansible_timeout' from source: unknown 30564 1726882817.31528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.31654: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882817.31664: variable 'omit' from source: magic vars 30564 1726882817.31673: starting attempt loop 30564 1726882817.31676: running the handler 30564 1726882817.31684: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882817.31723: _low_level_execute_command(): starting 30564 1726882817.31730: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882817.32491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.32500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.32544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.32547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.32550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.32610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882817.32620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882817.32626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882817.32724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882817.34388: stdout chunk (state=3): >>>/root <<< 30564 
1726882817.34491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882817.34540: stderr chunk (state=3): >>><<< 30564 1726882817.34543: stdout chunk (state=3): >>><<< 30564 1726882817.34573: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882817.34578: _low_level_execute_command(): starting 30564 1726882817.34582: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627 `" && echo ansible-tmp-1726882817.3455498-31289-260385999167627="` echo /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627 `" ) && sleep 0' 30564 1726882817.34997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.35008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.35035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882817.35041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.35049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882817.35056: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882817.35060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.35075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.35082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882817.35087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.35135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882817.35155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882817.35158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882817.35261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882817.37135: stdout chunk (state=3): >>>ansible-tmp-1726882817.3455498-31289-260385999167627=/root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627 <<< 30564 1726882817.37245: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 30564 1726882817.37294: stderr chunk (state=3): >>><<< 30564 1726882817.37297: stdout chunk (state=3): >>><<< 30564 1726882817.37311: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882817.3455498-31289-260385999167627=/root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882817.37335: variable 'ansible_module_compression' from source: unknown 30564 1726882817.37377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882817.37409: variable 'ansible_facts' from source: unknown 30564 1726882817.37455: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627/AnsiballZ_command.py 30564 1726882817.37555: Sending initial data 30564 1726882817.37558: 
Sent initial data (156 bytes) 30564 1726882817.38212: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882817.38216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.38247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.38250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.38254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.38314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882817.38317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882817.38322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882817.38420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882817.40160: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882817.40254: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882817.40354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp1woa26_y /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627/AnsiballZ_command.py <<< 30564 1726882817.40447: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882817.41462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882817.41568: stderr chunk (state=3): >>><<< 30564 1726882817.41574: stdout chunk (state=3): >>><<< 30564 1726882817.41593: done transferring module to remote 30564 1726882817.41606: _low_level_execute_command(): starting 30564 1726882817.41609: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627/ /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627/AnsiballZ_command.py && sleep 0' 30564 1726882817.42049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.42093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882817.42096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 
1726882817.42098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.42104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.42149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882817.42161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882817.42272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882817.44068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882817.44118: stderr chunk (state=3): >>><<< 30564 1726882817.44122: stdout chunk (state=3): >>><<< 30564 1726882817.44135: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882817.44137: _low_level_execute_command(): starting 30564 1726882817.44142: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627/AnsiballZ_command.py && sleep 0' 30564 1726882817.44565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.44574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.44601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.44614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.44678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882817.44682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882817.44799: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882817.64804: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (ef1ddb35-9196-4b00-9c2c-f98653d92d9c) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:40:17.579198", "end": "2024-09-20 21:40:17.645912", "delta": "0:00:00.066714", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882817.66121: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882817.66125: stdout chunk (state=3): >>><<< 30564 1726882817.66127: stderr chunk (state=3): >>><<< 30564 1726882817.66281: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (ef1ddb35-9196-4b00-9c2c-f98653d92d9c) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:40:17.579198", "end": "2024-09-20 21:40:17.645912", "delta": "0:00:00.066714", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 30564 1726882817.66285: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882817.66292: _low_level_execute_command(): starting 30564 1726882817.66295: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882817.3455498-31289-260385999167627/ > /dev/null 2>&1 && sleep 0' 30564 1726882817.66936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882817.66948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882817.66966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.66985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.67024: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882817.67034: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882817.67051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.67074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882817.67085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882817.67095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882817.67104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882817.67114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882817.67127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882817.67135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882817.67144: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882817.67160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882817.67239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882817.67269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882817.67289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882817.67415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882817.69321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882817.69325: stdout chunk (state=3): >>><<< 30564 1726882817.69327: stderr chunk (state=3): >>><<< 30564 1726882817.69739: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882817.69742: handler run complete 30564 1726882817.69745: Evaluated conditional (False): False 30564 1726882817.69747: attempt loop complete, returning result 30564 1726882817.69749: _execute() done 30564 1726882817.69751: dumping result to json 30564 1726882817.69753: done dumping result, returning 30564 1726882817.69755: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0e448fcc-3ce9-4216-acec-00000000050b] 30564 1726882817.69756: sending task result for task 0e448fcc-3ce9-4216-acec-00000000050b fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.066714", "end": "2024-09-20 21:40:17.645912", "rc": 1, "start": "2024-09-20 21:40:17.579198" } STDOUT: Connection 'statebr' (ef1ddb35-9196-4b00-9c2c-f98653d92d9c) successfully deleted. STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30564 1726882817.69916: no more pending results, returning what we have 30564 1726882817.69923: results queue empty 30564 1726882817.69926: checking for any_errors_fatal 30564 1726882817.69928: done checking for any_errors_fatal 30564 1726882817.69929: checking for max_fail_percentage 30564 1726882817.69931: done checking for max_fail_percentage 30564 1726882817.69932: checking to see if all hosts have failed and the running result is not ok 30564 1726882817.69934: done checking to see if all hosts have failed 30564 1726882817.69935: getting the remaining hosts for this loop 30564 1726882817.69936: done getting the remaining hosts for this loop 30564 1726882817.69943: getting the next task for host managed_node2 30564 1726882817.69954: done getting next task for host managed_node2 30564 1726882817.69957: ^ task is: TASK: Include the task 'run_test.yml' 30564 1726882817.69961: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882817.70034: getting variables 30564 1726882817.70036: in VariableManager get_vars() 30564 1726882817.70095: Calling all_inventory to load vars for managed_node2 30564 1726882817.70099: Calling groups_inventory to load vars for managed_node2 30564 1726882817.70102: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.70116: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.70125: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.70130: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.70654: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000050b 30564 1726882817.70658: WORKER PROCESS EXITING 30564 1726882817.72942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882817.75750: done with get_vars() 30564 1726882817.75789: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:45 Friday 20 September 2024 21:40:17 -0400 (0:00:00.458) 0:00:16.340 ****** 30564 1726882817.75888: entering _queue_task() for managed_node2/include_tasks 30564 1726882817.76620: worker is 1 (out of 1 available) 30564 1726882817.76630: exiting _queue_task() for managed_node2/include_tasks 30564 1726882817.76642: done queuing things up, now waiting for results queue to drain 30564 1726882817.76643: waiting for pending results... 
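[Annotation] The failed "Cleanup profile and device" task above ran a four-line shell script under `/bin/sh -c`; its exit status is that of the last command. `nmcli con delete statebr` succeeded (see STDOUT), but the trailing `ip link del statebr` failed (`Cannot find device "statebr"`), so the whole task returned rc=1 and was then tolerated ("...ignoring"). A minimal sketch of that rc-propagation behavior, using harmless stand-in commands instead of the real nmcli/ip calls:

```python
import subprocess

# Stand-in for the multi-line cleanup script in the log. Under /bin/sh -c,
# the commands run in sequence and the script's exit status is that of the
# LAST command, so one late failure makes the whole task rc=1 even though
# the earlier deletions succeeded.
script = (
    "echo deleted\n"   # stands in for: nmcli con delete statebr (succeeds)
    "true\n"           # stands in for: nmcli con load / rm -f (succeed)
    "false\n"          # stands in for: ip link del statebr (device absent)
)
proc = subprocess.run(["/bin/sh", "-c", script], capture_output=True, text=True)
print(proc.stdout.strip())   # deleted
print(proc.returncode)       # 1
```

In a playbook such a best-effort cleanup task typically sets `ignore_errors: true`, which is what produces the "...ignoring" marker in the result above.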
30564 1726882817.76984: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30564 1726882817.77088: in run() - task 0e448fcc-3ce9-4216-acec-00000000000f 30564 1726882817.77111: variable 'ansible_search_path' from source: unknown 30564 1726882817.77149: calling self._execute() 30564 1726882817.77246: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.77260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.77281: variable 'omit' from source: magic vars 30564 1726882817.77680: variable 'ansible_distribution_major_version' from source: facts 30564 1726882817.77699: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882817.77709: _execute() done 30564 1726882817.77717: dumping result to json 30564 1726882817.77723: done dumping result, returning 30564 1726882817.77732: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-4216-acec-00000000000f] 30564 1726882817.77741: sending task result for task 0e448fcc-3ce9-4216-acec-00000000000f 30564 1726882817.77892: no more pending results, returning what we have 30564 1726882817.77898: in VariableManager get_vars() 30564 1726882817.77933: Calling all_inventory to load vars for managed_node2 30564 1726882817.77936: Calling groups_inventory to load vars for managed_node2 30564 1726882817.77940: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.77955: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.77958: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.77961: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.79057: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000000f 30564 1726882817.79060: WORKER PROCESS EXITING 30564 1726882817.79997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30564 1726882817.83243: done with get_vars() 30564 1726882817.83473: variable 'ansible_search_path' from source: unknown 30564 1726882817.83491: we have included files to process 30564 1726882817.83492: generating all_blocks data 30564 1726882817.83494: done generating all_blocks data 30564 1726882817.83500: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882817.83501: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882817.83504: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882817.84322: in VariableManager get_vars() 30564 1726882817.84340: done with get_vars() 30564 1726882817.84386: in VariableManager get_vars() 30564 1726882817.84403: done with get_vars() 30564 1726882817.84442: in VariableManager get_vars() 30564 1726882817.84457: done with get_vars() 30564 1726882817.84502: in VariableManager get_vars() 30564 1726882817.84518: done with get_vars() 30564 1726882817.84559: in VariableManager get_vars() 30564 1726882817.84579: done with get_vars() 30564 1726882817.85542: in VariableManager get_vars() 30564 1726882817.85558: done with get_vars() 30564 1726882817.85573: done processing included file 30564 1726882817.85575: iterating over new_blocks loaded from include file 30564 1726882817.85577: in VariableManager get_vars() 30564 1726882817.85587: done with get_vars() 30564 1726882817.85589: filtering new block on tags 30564 1726882817.85682: done filtering new block on tags 30564 1726882817.85685: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30564 1726882817.85690: extending task lists for all hosts with included 
blocks 30564 1726882817.85722: done extending task lists 30564 1726882817.85724: done processing included files 30564 1726882817.85724: results queue empty 30564 1726882817.85725: checking for any_errors_fatal 30564 1726882817.85728: done checking for any_errors_fatal 30564 1726882817.85729: checking for max_fail_percentage 30564 1726882817.85730: done checking for max_fail_percentage 30564 1726882817.85731: checking to see if all hosts have failed and the running result is not ok 30564 1726882817.85731: done checking to see if all hosts have failed 30564 1726882817.85732: getting the remaining hosts for this loop 30564 1726882817.85733: done getting the remaining hosts for this loop 30564 1726882817.85736: getting the next task for host managed_node2 30564 1726882817.85739: done getting next task for host managed_node2 30564 1726882817.85741: ^ task is: TASK: TEST: {{ lsr_description }} 30564 1726882817.85743: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882817.85745: getting variables 30564 1726882817.85746: in VariableManager get_vars() 30564 1726882817.85754: Calling all_inventory to load vars for managed_node2 30564 1726882817.85756: Calling groups_inventory to load vars for managed_node2 30564 1726882817.85759: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.85766: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.85771: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.85774: Calling groups_plugins_play to load vars for managed_node2 30564 1726882817.88380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882817.92388: done with get_vars() 30564 1726882817.92417: done getting variables 30564 1726882817.92463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882817.92887: variable 'lsr_description' from source: include params TASK [TEST: I can create a profile without autoconnect] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:40:17 -0400 (0:00:00.170) 0:00:16.510 ****** 30564 1726882817.92917: entering _queue_task() for managed_node2/debug 30564 1726882817.93684: worker is 1 (out of 1 available) 30564 1726882817.93696: exiting _queue_task() for managed_node2/debug 30564 1726882817.93707: done queuing things up, now waiting for results queue to drain 30564 1726882817.93708: waiting for pending results... 
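Annotation: the `TASK [TEST: I can create a profile without autoconnect]` banner above corresponds to run_test.yml:5, whose name is templated from `lsr_description` (visible as "variable 'lsr_description' from source: include params"). From the rendered MSG that appears later in the log, the underlying task is plausibly a `debug` along these lines; this is a reconstructed sketch inferred from the output, not the verbatim run_test.yml source:

```yaml
# Reconstructed sketch -- inferred from this log's output, not the actual
# run_test.yml source. lsr_description arrives via include params, so the
# templated task name becomes "TEST: I can create a profile without autoconnect".
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: |-
      ##########
      {{ lsr_description }}
      ##########
```

This explains why the executor evaluates `lsr_description` before dumping the result: the variable is needed both for the task name and for the debug message body.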
30564 1726882817.94606: running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile without autoconnect 30564 1726882817.95071: in run() - task 0e448fcc-3ce9-4216-acec-0000000005b4 30564 1726882817.95077: variable 'ansible_search_path' from source: unknown 30564 1726882817.95081: variable 'ansible_search_path' from source: unknown 30564 1726882817.95084: calling self._execute() 30564 1726882817.95087: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.95090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.95094: variable 'omit' from source: magic vars 30564 1726882817.95769: variable 'ansible_distribution_major_version' from source: facts 30564 1726882817.95897: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882817.95902: variable 'omit' from source: magic vars 30564 1726882817.95940: variable 'omit' from source: magic vars 30564 1726882817.96151: variable 'lsr_description' from source: include params 30564 1726882817.96173: variable 'omit' from source: magic vars 30564 1726882817.96331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882817.96370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882817.96386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882817.96404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.96417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882817.96583: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882817.96587: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882817.96589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.96811: Set connection var ansible_timeout to 10 30564 1726882817.96814: Set connection var ansible_pipelining to False 30564 1726882817.96817: Set connection var ansible_shell_type to sh 30564 1726882817.96824: Set connection var ansible_shell_executable to /bin/sh 30564 1726882817.96831: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882817.96834: Set connection var ansible_connection to ssh 30564 1726882817.96857: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.96861: variable 'ansible_connection' from source: unknown 30564 1726882817.96979: variable 'ansible_module_compression' from source: unknown 30564 1726882817.96982: variable 'ansible_shell_type' from source: unknown 30564 1726882817.96985: variable 'ansible_shell_executable' from source: unknown 30564 1726882817.96987: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882817.96990: variable 'ansible_pipelining' from source: unknown 30564 1726882817.96992: variable 'ansible_timeout' from source: unknown 30564 1726882817.96994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882817.97283: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882817.97294: variable 'omit' from source: magic vars 30564 1726882817.97413: starting attempt loop 30564 1726882817.97416: running the handler 30564 1726882817.97458: handler run complete 30564 1726882817.97474: attempt loop complete, returning result 30564 1726882817.97478: _execute() done 30564 1726882817.97481: dumping result to json 30564 1726882817.97483: done dumping result, returning 
30564 1726882817.97490: done running TaskExecutor() for managed_node2/TASK: TEST: I can create a profile without autoconnect [0e448fcc-3ce9-4216-acec-0000000005b4] 30564 1726882817.97496: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b4 30564 1726882817.97703: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b4 30564 1726882817.97705: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can create a profile without autoconnect ########## 30564 1726882817.97751: no more pending results, returning what we have 30564 1726882817.97756: results queue empty 30564 1726882817.97758: checking for any_errors_fatal 30564 1726882817.97759: done checking for any_errors_fatal 30564 1726882817.97760: checking for max_fail_percentage 30564 1726882817.97762: done checking for max_fail_percentage 30564 1726882817.97765: checking to see if all hosts have failed and the running result is not ok 30564 1726882817.97766: done checking to see if all hosts have failed 30564 1726882817.97769: getting the remaining hosts for this loop 30564 1726882817.97771: done getting the remaining hosts for this loop 30564 1726882817.97775: getting the next task for host managed_node2 30564 1726882817.97783: done getting next task for host managed_node2 30564 1726882817.97786: ^ task is: TASK: Show item 30564 1726882817.97789: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882817.97794: getting variables 30564 1726882817.97795: in VariableManager get_vars() 30564 1726882817.97826: Calling all_inventory to load vars for managed_node2 30564 1726882817.97829: Calling groups_inventory to load vars for managed_node2 30564 1726882817.97833: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882817.97845: Calling all_plugins_play to load vars for managed_node2 30564 1726882817.97848: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882817.97850: Calling groups_plugins_play to load vars for managed_node2 30564 1726882818.00656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882818.02452: done with get_vars() 30564 1726882818.02486: done getting variables 30564 1726882818.02544: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:40:18 -0400 (0:00:00.096) 0:00:16.607 ****** 30564 1726882818.02578: entering _queue_task() for managed_node2/debug 30564 1726882818.02916: worker is 1 (out of 1 available) 30564 1726882818.02929: exiting _queue_task() for managed_node2/debug 30564 1726882818.02942: done queuing things up, now waiting for results queue to drain 30564 1726882818.02943: waiting for pending results... 
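Annotation: the `TASK [Show item]` banner above corresponds to run_test.yml:9. The per-item results that follow (lsr_description, lsr_setup, lsr_test, lsr_assert, lsr_assert_when, lsr_fail_debug) suggest a looped `debug` over variable names, roughly like this reconstructed sketch (inferred from the log, not the verbatim source):

```yaml
# Reconstructed sketch -- inferred from the per-item results in this log,
# not the actual run_test.yml source. Each loop item is a variable *name*,
# dereferenced via debug's var parameter; an undefined variable (e.g.
# lsr_assert_when here) renders as "VARIABLE IS NOT DEFINED!" in the
# output instead of failing the task.
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
```

This matches the repeated per-item pattern in the log: for each item the executor re-evaluates the conditional, re-resolves connection vars, reloads the `debug` action from cache, and emits one `ok: [managed_node2] => (item=...)` result with `ansible_loop_var: item`.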
30564 1726882818.03239: running TaskExecutor() for managed_node2/TASK: Show item 30564 1726882818.03364: in run() - task 0e448fcc-3ce9-4216-acec-0000000005b5 30564 1726882818.03392: variable 'ansible_search_path' from source: unknown 30564 1726882818.03401: variable 'ansible_search_path' from source: unknown 30564 1726882818.03455: variable 'omit' from source: magic vars 30564 1726882818.03628: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.03642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.03657: variable 'omit' from source: magic vars 30564 1726882818.04039: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.04060: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.04077: variable 'omit' from source: magic vars 30564 1726882818.04120: variable 'omit' from source: magic vars 30564 1726882818.04177: variable 'item' from source: unknown 30564 1726882818.04255: variable 'item' from source: unknown 30564 1726882818.04283: variable 'omit' from source: magic vars 30564 1726882818.04329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882818.04380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.04402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882818.04424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.04440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.04484: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.04492: variable 'ansible_host' from source: host vars for 'managed_node2' 
30564 1726882818.04500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.04617: Set connection var ansible_timeout to 10 30564 1726882818.04626: Set connection var ansible_pipelining to False 30564 1726882818.04633: Set connection var ansible_shell_type to sh 30564 1726882818.04642: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.04653: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.04659: Set connection var ansible_connection to ssh 30564 1726882818.05408: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.05416: variable 'ansible_connection' from source: unknown 30564 1726882818.05424: variable 'ansible_module_compression' from source: unknown 30564 1726882818.05430: variable 'ansible_shell_type' from source: unknown 30564 1726882818.05437: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.05443: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.05451: variable 'ansible_pipelining' from source: unknown 30564 1726882818.05457: variable 'ansible_timeout' from source: unknown 30564 1726882818.05466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.05723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.05738: variable 'omit' from source: magic vars 30564 1726882818.05747: starting attempt loop 30564 1726882818.05825: running the handler 30564 1726882818.05879: variable 'lsr_description' from source: include params 30564 1726882818.06062: variable 'lsr_description' from source: include params 30564 1726882818.06082: handler run complete 30564 1726882818.06103: attempt loop 
complete, returning result 30564 1726882818.06122: variable 'item' from source: unknown 30564 1726882818.06304: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can create a profile without autoconnect" } 30564 1726882818.06538: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.06554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.06572: variable 'omit' from source: magic vars 30564 1726882818.06735: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.06745: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.06753: variable 'omit' from source: magic vars 30564 1726882818.06774: variable 'omit' from source: magic vars 30564 1726882818.06822: variable 'item' from source: unknown 30564 1726882818.06888: variable 'item' from source: unknown 30564 1726882818.06906: variable 'omit' from source: magic vars 30564 1726882818.06933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.06945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.06956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.06975: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.06983: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.06989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.07071: Set connection var ansible_timeout to 10 30564 1726882818.07083: Set connection var ansible_pipelining to False 30564 
1726882818.07089: Set connection var ansible_shell_type to sh 30564 1726882818.07097: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.07107: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.07112: Set connection var ansible_connection to ssh 30564 1726882818.07139: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.07146: variable 'ansible_connection' from source: unknown 30564 1726882818.07152: variable 'ansible_module_compression' from source: unknown 30564 1726882818.07158: variable 'ansible_shell_type' from source: unknown 30564 1726882818.07165: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.07175: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.07183: variable 'ansible_pipelining' from source: unknown 30564 1726882818.07189: variable 'ansible_timeout' from source: unknown 30564 1726882818.07195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.07289: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.07302: variable 'omit' from source: magic vars 30564 1726882818.07310: starting attempt loop 30564 1726882818.07316: running the handler 30564 1726882818.07339: variable 'lsr_setup' from source: include params 30564 1726882818.07414: variable 'lsr_setup' from source: include params 30564 1726882818.07470: handler run complete 30564 1726882818.07490: attempt loop complete, returning result 30564 1726882818.07509: variable 'item' from source: unknown 30564 1726882818.07576: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/delete_interface.yml", "tasks/assert_device_absent.yml" ] } 30564 1726882818.07740: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.07752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.07766: variable 'omit' from source: magic vars 30564 1726882818.07925: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.07935: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.07942: variable 'omit' from source: magic vars 30564 1726882818.07957: variable 'omit' from source: magic vars 30564 1726882818.08002: variable 'item' from source: unknown 30564 1726882818.08071: variable 'item' from source: unknown 30564 1726882818.08089: variable 'omit' from source: magic vars 30564 1726882818.08110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.08124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.08133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.08148: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.08154: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.08160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.08242: Set connection var ansible_timeout to 10 30564 1726882818.08252: Set connection var ansible_pipelining to False 30564 1726882818.08258: Set connection var ansible_shell_type to sh 30564 1726882818.08271: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.08284: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.08290: Set 
connection var ansible_connection to ssh 30564 1726882818.08312: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.08318: variable 'ansible_connection' from source: unknown 30564 1726882818.08324: variable 'ansible_module_compression' from source: unknown 30564 1726882818.08331: variable 'ansible_shell_type' from source: unknown 30564 1726882818.08339: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.08345: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.08352: variable 'ansible_pipelining' from source: unknown 30564 1726882818.08358: variable 'ansible_timeout' from source: unknown 30564 1726882818.08369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.08458: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.08476: variable 'omit' from source: magic vars 30564 1726882818.08484: starting attempt loop 30564 1726882818.08490: running the handler 30564 1726882818.08512: variable 'lsr_test' from source: include params 30564 1726882818.08587: variable 'lsr_test' from source: include params 30564 1726882818.08608: handler run complete 30564 1726882818.08625: attempt loop complete, returning result 30564 1726882818.08643: variable 'item' from source: unknown 30564 1726882818.08714: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bridge_profile_no_autoconnect.yml" ] } 30564 1726882818.08860: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.08878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882818.08894: variable 'omit' from source: magic vars 30564 1726882818.09054: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.09069: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.09080: variable 'omit' from source: magic vars 30564 1726882818.09097: variable 'omit' from source: magic vars 30564 1726882818.09143: variable 'item' from source: unknown 30564 1726882818.09210: variable 'item' from source: unknown 30564 1726882818.09228: variable 'omit' from source: magic vars 30564 1726882818.09253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.09269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.09281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.09296: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.09303: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.09310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.09390: Set connection var ansible_timeout to 10 30564 1726882818.09400: Set connection var ansible_pipelining to False 30564 1726882818.09405: Set connection var ansible_shell_type to sh 30564 1726882818.09414: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.09424: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.09429: Set connection var ansible_connection to ssh 30564 1726882818.09454: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.09465: variable 'ansible_connection' from source: unknown 30564 1726882818.09475: variable 'ansible_module_compression' from 
source: unknown 30564 1726882818.09483: variable 'ansible_shell_type' from source: unknown 30564 1726882818.09490: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.09496: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.09503: variable 'ansible_pipelining' from source: unknown 30564 1726882818.09508: variable 'ansible_timeout' from source: unknown 30564 1726882818.09514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.09603: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.09615: variable 'omit' from source: magic vars 30564 1726882818.09622: starting attempt loop 30564 1726882818.09628: running the handler 30564 1726882818.09647: variable 'lsr_assert' from source: include params 30564 1726882818.09715: variable 'lsr_assert' from source: include params 30564 1726882818.09736: handler run complete 30564 1726882818.09753: attempt loop complete, returning result 30564 1726882818.09775: variable 'item' from source: unknown 30564 1726882818.09839: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_absent.yml", "tasks/assert_profile_present.yml" ] } 30564 1726882818.09989: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.10003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.10017: variable 'omit' from source: magic vars 30564 1726882818.10207: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.10217: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30564 1726882818.10230: variable 'omit' from source: magic vars 30564 1726882818.10247: variable 'omit' from source: magic vars 30564 1726882818.10299: variable 'item' from source: unknown 30564 1726882818.10362: variable 'item' from source: unknown 30564 1726882818.10386: variable 'omit' from source: magic vars 30564 1726882818.10407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.10418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.10429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.10443: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.10451: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.10457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.10537: Set connection var ansible_timeout to 10 30564 1726882818.10547: Set connection var ansible_pipelining to False 30564 1726882818.10554: Set connection var ansible_shell_type to sh 30564 1726882818.10566: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.10581: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.10592: Set connection var ansible_connection to ssh 30564 1726882818.10617: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.10625: variable 'ansible_connection' from source: unknown 30564 1726882818.10633: variable 'ansible_module_compression' from source: unknown 30564 1726882818.10640: variable 'ansible_shell_type' from source: unknown 30564 1726882818.10646: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.10652: variable 'ansible_host' from source: host vars 
for 'managed_node2' 30564 1726882818.10659: variable 'ansible_pipelining' from source: unknown 30564 1726882818.10666: variable 'ansible_timeout' from source: unknown 30564 1726882818.10679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.10769: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.10783: variable 'omit' from source: magic vars 30564 1726882818.10791: starting attempt loop 30564 1726882818.10797: running the handler 30564 1726882818.10902: handler run complete 30564 1726882818.10923: attempt loop complete, returning result 30564 1726882818.10943: variable 'item' from source: unknown 30564 1726882818.11008: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30564 1726882818.11151: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.11165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.11181: variable 'omit' from source: magic vars 30564 1726882818.11329: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.11340: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.11352: variable 'omit' from source: magic vars 30564 1726882818.11374: variable 'omit' from source: magic vars 30564 1726882818.11416: variable 'item' from source: unknown 30564 1726882818.11486: variable 'item' from source: unknown 30564 1726882818.11505: variable 'omit' from source: magic vars 30564 1726882818.11526: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.11537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.11545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.11556: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.11566: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.11576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.11634: Set connection var ansible_timeout to 10 30564 1726882818.11681: Set connection var ansible_pipelining to False 30564 1726882818.11689: Set connection var ansible_shell_type to sh 30564 1726882818.11698: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.11792: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.11800: Set connection var ansible_connection to ssh 30564 1726882818.11826: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.11834: variable 'ansible_connection' from source: unknown 30564 1726882818.11840: variable 'ansible_module_compression' from source: unknown 30564 1726882818.11847: variable 'ansible_shell_type' from source: unknown 30564 1726882818.11853: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.11859: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.11872: variable 'ansible_pipelining' from source: unknown 30564 1726882818.11880: variable 'ansible_timeout' from source: unknown 30564 1726882818.11892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.12057: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.12116: variable 'omit' from source: magic vars 30564 1726882818.12126: starting attempt loop 30564 1726882818.12132: running the handler 30564 1726882818.12153: variable 'lsr_fail_debug' from source: play vars 30564 1726882818.12278: variable 'lsr_fail_debug' from source: play vars 30564 1726882818.12342: handler run complete 30564 1726882818.12447: attempt loop complete, returning result 30564 1726882818.12470: variable 'item' from source: unknown 30564 1726882818.12531: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30564 1726882818.12798: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.12812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.12825: variable 'omit' from source: magic vars 30564 1726882818.13122: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.13196: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.13205: variable 'omit' from source: magic vars 30564 1726882818.13222: variable 'omit' from source: magic vars 30564 1726882818.13270: variable 'item' from source: unknown 30564 1726882818.13378: variable 'item' from source: unknown 30564 1726882818.13403: variable 'omit' from source: magic vars 30564 1726882818.13433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.13446: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.13456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.13483: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.13491: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.13498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.13592: Set connection var ansible_timeout to 10 30564 1726882818.13607: Set connection var ansible_pipelining to False 30564 1726882818.13615: Set connection var ansible_shell_type to sh 30564 1726882818.13629: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.13641: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.13648: Set connection var ansible_connection to ssh 30564 1726882818.13679: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.13686: variable 'ansible_connection' from source: unknown 30564 1726882818.13693: variable 'ansible_module_compression' from source: unknown 30564 1726882818.13698: variable 'ansible_shell_type' from source: unknown 30564 1726882818.13704: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.13709: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.13716: variable 'ansible_pipelining' from source: unknown 30564 1726882818.13726: variable 'ansible_timeout' from source: unknown 30564 1726882818.13739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.13826: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.13838: variable 'omit' from source: magic vars 30564 1726882818.13850: starting attempt loop 30564 1726882818.13856: running the handler 30564 1726882818.13882: variable 'lsr_cleanup' from source: include params 30564 1726882818.13946: variable 'lsr_cleanup' from source: include params 30564 1726882818.13976: handler run complete 30564 1726882818.13995: attempt loop complete, returning result 30564 1726882818.14016: variable 'item' from source: unknown 30564 1726882818.14087: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30564 1726882818.14175: dumping result to json 30564 1726882818.14186: done dumping result, returning 30564 1726882818.14199: done running TaskExecutor() for managed_node2/TASK: Show item [0e448fcc-3ce9-4216-acec-0000000005b5] 30564 1726882818.14210: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b5 30564 1726882818.14339: no more pending results, returning what we have 30564 1726882818.14343: results queue empty 30564 1726882818.14344: checking for any_errors_fatal 30564 1726882818.14350: done checking for any_errors_fatal 30564 1726882818.14350: checking for max_fail_percentage 30564 1726882818.14352: done checking for max_fail_percentage 30564 1726882818.14353: checking to see if all hosts have failed and the running result is not ok 30564 1726882818.14354: done checking to see if all hosts have failed 30564 1726882818.14355: getting the remaining hosts for this loop 30564 1726882818.14357: done getting the remaining hosts for this loop 30564 1726882818.14360: getting the next task for host managed_node2 30564 1726882818.14371: done getting next task for host managed_node2 30564 
1726882818.14374: ^ task is: TASK: Include the task 'show_interfaces.yml' 30564 1726882818.14377: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882818.14381: getting variables 30564 1726882818.14383: in VariableManager get_vars() 30564 1726882818.14414: Calling all_inventory to load vars for managed_node2 30564 1726882818.14417: Calling groups_inventory to load vars for managed_node2 30564 1726882818.14420: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882818.14432: Calling all_plugins_play to load vars for managed_node2 30564 1726882818.14436: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882818.14438: Calling groups_plugins_play to load vars for managed_node2 30564 1726882818.15484: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b5 30564 1726882818.15487: WORKER PROCESS EXITING 30564 1726882818.17047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882818.18818: done with get_vars() 30564 1726882818.18849: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:40:18 -0400 (0:00:00.163) 0:00:16.770 ****** 30564 1726882818.18953: entering _queue_task() for managed_node2/include_tasks 30564 
1726882818.19416: worker is 1 (out of 1 available) 30564 1726882818.19428: exiting _queue_task() for managed_node2/include_tasks 30564 1726882818.19441: done queuing things up, now waiting for results queue to drain 30564 1726882818.19442: waiting for pending results... 30564 1726882818.19732: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30564 1726882818.19850: in run() - task 0e448fcc-3ce9-4216-acec-0000000005b6 30564 1726882818.19878: variable 'ansible_search_path' from source: unknown 30564 1726882818.19890: variable 'ansible_search_path' from source: unknown 30564 1726882818.19929: calling self._execute() 30564 1726882818.20038: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.20050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.20070: variable 'omit' from source: magic vars 30564 1726882818.20678: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.20697: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.20707: _execute() done 30564 1726882818.20714: dumping result to json 30564 1726882818.20722: done dumping result, returning 30564 1726882818.20729: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-4216-acec-0000000005b6] 30564 1726882818.20738: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b6 30564 1726882818.20960: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b6 30564 1726882818.20993: no more pending results, returning what we have 30564 1726882818.20999: in VariableManager get_vars() 30564 1726882818.21036: Calling all_inventory to load vars for managed_node2 30564 1726882818.21039: Calling groups_inventory to load vars for managed_node2 30564 1726882818.21043: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882818.21057: Calling all_plugins_play to 
load vars for managed_node2 30564 1726882818.21061: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882818.21065: Calling groups_plugins_play to load vars for managed_node2 30564 1726882818.22186: WORKER PROCESS EXITING 30564 1726882818.22942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882818.24733: done with get_vars() 30564 1726882818.24753: variable 'ansible_search_path' from source: unknown 30564 1726882818.24755: variable 'ansible_search_path' from source: unknown 30564 1726882818.24799: we have included files to process 30564 1726882818.24800: generating all_blocks data 30564 1726882818.24802: done generating all_blocks data 30564 1726882818.24809: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882818.24810: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882818.24812: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882818.24925: in VariableManager get_vars() 30564 1726882818.24943: done with get_vars() 30564 1726882818.25054: done processing included file 30564 1726882818.25056: iterating over new_blocks loaded from include file 30564 1726882818.25057: in VariableManager get_vars() 30564 1726882818.25076: done with get_vars() 30564 1726882818.25078: filtering new block on tags 30564 1726882818.25113: done filtering new block on tags 30564 1726882818.25115: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30564 1726882818.25120: extending task lists for all hosts with included blocks 30564 1726882818.25553: done 
extending task lists 30564 1726882818.25554: done processing included files 30564 1726882818.25555: results queue empty 30564 1726882818.25556: checking for any_errors_fatal 30564 1726882818.25565: done checking for any_errors_fatal 30564 1726882818.25566: checking for max_fail_percentage 30564 1726882818.25570: done checking for max_fail_percentage 30564 1726882818.25571: checking to see if all hosts have failed and the running result is not ok 30564 1726882818.25571: done checking to see if all hosts have failed 30564 1726882818.25572: getting the remaining hosts for this loop 30564 1726882818.25574: done getting the remaining hosts for this loop 30564 1726882818.25577: getting the next task for host managed_node2 30564 1726882818.25581: done getting next task for host managed_node2 30564 1726882818.25583: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30564 1726882818.25587: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882818.25590: getting variables 30564 1726882818.25591: in VariableManager get_vars() 30564 1726882818.25601: Calling all_inventory to load vars for managed_node2 30564 1726882818.25603: Calling groups_inventory to load vars for managed_node2 30564 1726882818.25605: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882818.25611: Calling all_plugins_play to load vars for managed_node2 30564 1726882818.25613: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882818.25616: Calling groups_plugins_play to load vars for managed_node2 30564 1726882818.28494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882818.30290: done with get_vars() 30564 1726882818.30318: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:40:18 -0400 (0:00:00.114) 0:00:16.885 ****** 30564 1726882818.30406: entering _queue_task() for managed_node2/include_tasks 30564 1726882818.30751: worker is 1 (out of 1 available) 30564 1726882818.30769: exiting _queue_task() for managed_node2/include_tasks 30564 1726882818.30782: done queuing things up, now waiting for results queue to drain 30564 1726882818.30783: waiting for pending results... 
30564 1726882818.31116: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30564 1726882818.31240: in run() - task 0e448fcc-3ce9-4216-acec-0000000005dd 30564 1726882818.31259: variable 'ansible_search_path' from source: unknown 30564 1726882818.31280: variable 'ansible_search_path' from source: unknown 30564 1726882818.31320: calling self._execute() 30564 1726882818.31422: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.31435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.31456: variable 'omit' from source: magic vars 30564 1726882818.31844: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.31862: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.31880: _execute() done 30564 1726882818.31894: dumping result to json 30564 1726882818.31902: done dumping result, returning 30564 1726882818.31912: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-4216-acec-0000000005dd] 30564 1726882818.31922: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005dd 30564 1726882818.32047: no more pending results, returning what we have 30564 1726882818.32052: in VariableManager get_vars() 30564 1726882818.32089: Calling all_inventory to load vars for managed_node2 30564 1726882818.32091: Calling groups_inventory to load vars for managed_node2 30564 1726882818.32095: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882818.32110: Calling all_plugins_play to load vars for managed_node2 30564 1726882818.32114: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882818.32117: Calling groups_plugins_play to load vars for managed_node2 30564 1726882818.33257: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005dd 30564 1726882818.33260: WORKER PROCESS EXITING 30564 
1726882818.33904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882818.35647: done with get_vars() 30564 1726882818.35669: variable 'ansible_search_path' from source: unknown 30564 1726882818.35670: variable 'ansible_search_path' from source: unknown 30564 1726882818.35707: we have included files to process 30564 1726882818.35708: generating all_blocks data 30564 1726882818.35710: done generating all_blocks data 30564 1726882818.35712: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882818.35713: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882818.35715: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882818.35971: done processing included file 30564 1726882818.35973: iterating over new_blocks loaded from include file 30564 1726882818.35975: in VariableManager get_vars() 30564 1726882818.35990: done with get_vars() 30564 1726882818.35991: filtering new block on tags 30564 1726882818.36027: done filtering new block on tags 30564 1726882818.36030: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30564 1726882818.36035: extending task lists for all hosts with included blocks 30564 1726882818.36196: done extending task lists 30564 1726882818.36197: done processing included files 30564 1726882818.36198: results queue empty 30564 1726882818.36199: checking for any_errors_fatal 30564 1726882818.36202: done checking for any_errors_fatal 30564 1726882818.36202: checking for max_fail_percentage 30564 1726882818.36204: done 
checking for max_fail_percentage 30564 1726882818.36204: checking to see if all hosts have failed and the running result is not ok 30564 1726882818.36205: done checking to see if all hosts have failed 30564 1726882818.36206: getting the remaining hosts for this loop 30564 1726882818.36207: done getting the remaining hosts for this loop 30564 1726882818.36210: getting the next task for host managed_node2 30564 1726882818.36214: done getting next task for host managed_node2 30564 1726882818.36217: ^ task is: TASK: Gather current interface info 30564 1726882818.36220: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882818.36222: getting variables 30564 1726882818.36223: in VariableManager get_vars() 30564 1726882818.36231: Calling all_inventory to load vars for managed_node2 30564 1726882818.36233: Calling groups_inventory to load vars for managed_node2 30564 1726882818.36235: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882818.36240: Calling all_plugins_play to load vars for managed_node2 30564 1726882818.36243: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882818.36246: Calling groups_plugins_play to load vars for managed_node2 30564 1726882818.37415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882818.39056: done with get_vars() 30564 1726882818.39083: done getting variables 30564 1726882818.39126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:40:18 -0400 (0:00:00.087) 0:00:16.972 ****** 30564 1726882818.39157: entering _queue_task() for managed_node2/command 30564 1726882818.39484: worker is 1 (out of 1 available) 30564 1726882818.39498: exiting _queue_task() for managed_node2/command 30564 1726882818.39511: done queuing things up, now waiting for results queue to drain 30564 1726882818.39512: waiting for pending results... 
30564 1726882818.39791: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30564 1726882818.39913: in run() - task 0e448fcc-3ce9-4216-acec-000000000618 30564 1726882818.39930: variable 'ansible_search_path' from source: unknown 30564 1726882818.39937: variable 'ansible_search_path' from source: unknown 30564 1726882818.39978: calling self._execute() 30564 1726882818.40065: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.40079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.40093: variable 'omit' from source: magic vars 30564 1726882818.40445: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.40462: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.40474: variable 'omit' from source: magic vars 30564 1726882818.40527: variable 'omit' from source: magic vars 30564 1726882818.40563: variable 'omit' from source: magic vars 30564 1726882818.40617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882818.40657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.40688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882818.40715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.40731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.40765: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.40775: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.40783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882818.40894: Set connection var ansible_timeout to 10 30564 1726882818.40907: Set connection var ansible_pipelining to False 30564 1726882818.40914: Set connection var ansible_shell_type to sh 30564 1726882818.40926: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.40941: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.40948: Set connection var ansible_connection to ssh 30564 1726882818.40977: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.40986: variable 'ansible_connection' from source: unknown 30564 1726882818.40995: variable 'ansible_module_compression' from source: unknown 30564 1726882818.41002: variable 'ansible_shell_type' from source: unknown 30564 1726882818.41009: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.41016: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.41025: variable 'ansible_pipelining' from source: unknown 30564 1726882818.41032: variable 'ansible_timeout' from source: unknown 30564 1726882818.41045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.41192: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.41210: variable 'omit' from source: magic vars 30564 1726882818.41221: starting attempt loop 30564 1726882818.41228: running the handler 30564 1726882818.41248: _low_level_execute_command(): starting 30564 1726882818.41266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882818.42016: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882818.42034: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882818.42050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.42073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.42117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.42133: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882818.42147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.42170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882818.42183: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882818.42195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882818.42208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.42222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.42242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.42255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.42270: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882818.42286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.42366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882818.42391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882818.42407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882818.42551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30564 1726882818.44215: stdout chunk (state=3): >>>/root <<< 30564 1726882818.44323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882818.44404: stderr chunk (state=3): >>><<< 30564 1726882818.44421: stdout chunk (state=3): >>><<< 30564 1726882818.44540: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882818.44544: _low_level_execute_command(): starting 30564 1726882818.44546: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954 `" && echo ansible-tmp-1726882818.4444876-31327-753378963954="` echo /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954 `" ) && sleep 0' 30564 1726882818.45122: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882818.45135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.45148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.45169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.45216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.45227: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882818.45239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.45255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882818.45269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882818.45280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882818.45294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.45309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.45323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.45334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.45343: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882818.45354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.45435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882818.45455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882818.45473: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882818.45601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882818.47490: stdout chunk (state=3): >>>ansible-tmp-1726882818.4444876-31327-753378963954=/root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954 <<< 30564 1726882818.47679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882818.47682: stdout chunk (state=3): >>><<< 30564 1726882818.47685: stderr chunk (state=3): >>><<< 30564 1726882818.47770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882818.4444876-31327-753378963954=/root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882818.47773: variable 'ansible_module_compression' from source: unknown 30564 1726882818.47870: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882818.47874: variable 'ansible_facts' from source: unknown 30564 1726882818.47922: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954/AnsiballZ_command.py 30564 1726882818.48074: Sending initial data 30564 1726882818.48078: Sent initial data (153 bytes) 30564 1726882818.49036: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882818.49048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.49059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.49079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.49120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.49130: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882818.49143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.49159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882818.49172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882818.49185: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882818.49197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.49215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.49233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.49246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882818.49257: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882818.49275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.49352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882818.49378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882818.49395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882818.49523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882818.51278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882818.51376: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882818.51480: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpoyhznz43 /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954/AnsiballZ_command.py <<< 30564 1726882818.51576: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882818.52929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882818.53071: stderr chunk (state=3): >>><<< 30564 1726882818.53075: stdout chunk (state=3): >>><<< 30564 1726882818.53078: 
done transferring module to remote 30564 1726882818.53084: _low_level_execute_command(): starting 30564 1726882818.53086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954/ /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954/AnsiballZ_command.py && sleep 0' 30564 1726882818.53767: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882818.53784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.53799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.53819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.53873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.53887: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882818.53903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.53921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882818.53933: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882818.53948: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882818.53965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.53982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.53999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.54013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.54024: stderr chunk (state=3): >>>debug2: 
match found <<< 30564 1726882818.54039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.54122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882818.54143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882818.54162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882818.54305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882818.56139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882818.56215: stderr chunk (state=3): >>><<< 30564 1726882818.56226: stdout chunk (state=3): >>><<< 30564 1726882818.56273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 
1726882818.56277: _low_level_execute_command(): starting 30564 1726882818.56280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954/AnsiballZ_command.py && sleep 0' 30564 1726882818.56873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882818.56887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.56903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.56922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.56963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.56979: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882818.56993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.57011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882818.57024: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882818.57036: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882818.57049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.57063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.57082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.57094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.57105: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882818.57119: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.57193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882818.57210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882818.57225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882818.57381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882818.70887: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:18.703494", "end": "2024-09-20 21:40:18.706817", "delta": "0:00:00.003323", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882818.72136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882818.72140: stdout chunk (state=3): >>><<< 30564 1726882818.72145: stderr chunk (state=3): >>><<< 30564 1726882818.72167: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:18.703494", "end": "2024-09-20 21:40:18.706817", "delta": "0:00:00.003323", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
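The record above shows the module's JSON result arriving on stdout of the remote `AnsiballZ_command.py` run. A minimal sketch (not ansible-core internals) of how that payload is consumed: parse the JSON, check `rc`, and derive the interface list the way the later `Set current_interfaces` task appears to (the `raw` string below is trimmed to the fields used here — the real payload also carries the full `invocation` block — and the `stdout_lines`-style split is an assumption inferred from the fact values visible later in the log).

```python
import json

# Trimmed copy of the module result seen in the log; the real payload
# also includes "start"/"end"/"delta" and the full "invocation" details.
raw = (
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr",'
    ' "stderr": "", "rc": 0, "cmd": ["ls", "-1"]}'
)

result = json.loads(raw)
assert result["rc"] == 0  # task would otherwise be marked failed

# Equivalent of registering the command result and then setting
# current_interfaces from its stdout lines, as the set_fact task does.
current_interfaces = result["stdout"].splitlines()
print(current_interfaces)  # ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
```

The derived list matches the `current_interfaces` fact reported further down in the log (`bonding_masters`, `eth0`, `lo`, `rpltstbr`).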
30564 1726882818.72209: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882818.72215: _low_level_execute_command(): starting 30564 1726882818.72221: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882818.4444876-31327-753378963954/ > /dev/null 2>&1 && sleep 0' 30564 1726882818.73607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882818.73655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.73667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.73731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.73777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.73785: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882818.73853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.73869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882818.73880: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882818.73887: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882818.73898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882818.73903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882818.73915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882818.73922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882818.73931: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882818.73937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882818.74047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882818.74100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882818.74104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882818.74350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882818.76457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882818.76461: stdout chunk (state=3): >>><<< 30564 1726882818.76471: stderr chunk (state=3): >>><<< 30564 1726882818.76488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882818.76496: handler run complete 30564 1726882818.76521: Evaluated conditional (False): False 30564 1726882818.76532: attempt loop complete, returning result 30564 1726882818.76535: _execute() done 30564 1726882818.76537: dumping result to json 30564 1726882818.76543: done dumping result, returning 30564 1726882818.76550: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-4216-acec-000000000618] 30564 1726882818.76557: sending task result for task 0e448fcc-3ce9-4216-acec-000000000618 30564 1726882818.76670: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000618 30564 1726882818.76675: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003323", "end": "2024-09-20 21:40:18.706817", "rc": 0, "start": "2024-09-20 21:40:18.703494" } STDOUT: bonding_masters eth0 lo rpltstbr 30564 1726882818.76744: no more pending results, returning what we have 30564 1726882818.76748: results queue empty 30564 1726882818.76749: checking for any_errors_fatal 30564 1726882818.76751: done checking for any_errors_fatal 30564 1726882818.76751: checking for max_fail_percentage 30564 1726882818.76753: done checking for max_fail_percentage 30564 1726882818.76754: checking to see if all 
hosts have failed and the running result is not ok 30564 1726882818.76755: done checking to see if all hosts have failed 30564 1726882818.76756: getting the remaining hosts for this loop 30564 1726882818.76758: done getting the remaining hosts for this loop 30564 1726882818.76761: getting the next task for host managed_node2 30564 1726882818.76770: done getting next task for host managed_node2 30564 1726882818.76773: ^ task is: TASK: Set current_interfaces 30564 1726882818.76778: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882818.76782: getting variables 30564 1726882818.76783: in VariableManager get_vars() 30564 1726882818.76813: Calling all_inventory to load vars for managed_node2 30564 1726882818.76815: Calling groups_inventory to load vars for managed_node2 30564 1726882818.76819: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882818.76828: Calling all_plugins_play to load vars for managed_node2 30564 1726882818.76831: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882818.76838: Calling groups_plugins_play to load vars for managed_node2 30564 1726882818.88706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882818.92550: done with get_vars() 30564 1726882818.92689: done getting variables 30564 1726882818.92732: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:40:18 -0400 (0:00:00.535) 0:00:17.508 ****** 30564 1726882818.92759: entering _queue_task() for managed_node2/set_fact 30564 1726882818.93175: worker is 1 (out of 1 available) 30564 1726882818.93187: exiting _queue_task() for managed_node2/set_fact 30564 1726882818.93201: done queuing things up, now waiting for results queue to drain 30564 1726882818.93202: waiting for pending results... 
30564 1726882818.95035: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30564 1726882818.95366: in run() - task 0e448fcc-3ce9-4216-acec-000000000619 30564 1726882818.95381: variable 'ansible_search_path' from source: unknown 30564 1726882818.95386: variable 'ansible_search_path' from source: unknown 30564 1726882818.95419: calling self._execute() 30564 1726882818.95615: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.95621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.95629: variable 'omit' from source: magic vars 30564 1726882818.96590: variable 'ansible_distribution_major_version' from source: facts 30564 1726882818.96605: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882818.96610: variable 'omit' from source: magic vars 30564 1726882818.96661: variable 'omit' from source: magic vars 30564 1726882818.96767: variable '_current_interfaces' from source: set_fact 30564 1726882818.96830: variable 'omit' from source: magic vars 30564 1726882818.96874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882818.96908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882818.96926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882818.96944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.96954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882818.98189: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882818.98193: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.98196: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.98298: Set connection var ansible_timeout to 10 30564 1726882818.98302: Set connection var ansible_pipelining to False 30564 1726882818.98305: Set connection var ansible_shell_type to sh 30564 1726882818.98312: Set connection var ansible_shell_executable to /bin/sh 30564 1726882818.98320: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882818.98323: Set connection var ansible_connection to ssh 30564 1726882818.98348: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.98352: variable 'ansible_connection' from source: unknown 30564 1726882818.98355: variable 'ansible_module_compression' from source: unknown 30564 1726882818.98357: variable 'ansible_shell_type' from source: unknown 30564 1726882818.98360: variable 'ansible_shell_executable' from source: unknown 30564 1726882818.98362: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882818.98367: variable 'ansible_pipelining' from source: unknown 30564 1726882818.98369: variable 'ansible_timeout' from source: unknown 30564 1726882818.98376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882818.98510: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882818.98520: variable 'omit' from source: magic vars 30564 1726882818.98526: starting attempt loop 30564 1726882818.98529: running the handler 30564 1726882818.98541: handler run complete 30564 1726882818.98549: attempt loop complete, returning result 30564 1726882818.98553: _execute() done 30564 1726882818.98556: dumping result to json 30564 1726882818.98558: done dumping result, returning 30564 
1726882818.98567: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-4216-acec-000000000619] 30564 1726882818.98575: sending task result for task 0e448fcc-3ce9-4216-acec-000000000619 30564 1726882818.98672: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000619 30564 1726882818.98675: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30564 1726882818.98729: no more pending results, returning what we have 30564 1726882818.98733: results queue empty 30564 1726882818.98734: checking for any_errors_fatal 30564 1726882818.98741: done checking for any_errors_fatal 30564 1726882818.98742: checking for max_fail_percentage 30564 1726882818.98743: done checking for max_fail_percentage 30564 1726882818.98744: checking to see if all hosts have failed and the running result is not ok 30564 1726882818.98745: done checking to see if all hosts have failed 30564 1726882818.98746: getting the remaining hosts for this loop 30564 1726882818.98747: done getting the remaining hosts for this loop 30564 1726882818.98751: getting the next task for host managed_node2 30564 1726882818.98760: done getting next task for host managed_node2 30564 1726882818.98765: ^ task is: TASK: Show current_interfaces 30564 1726882818.98769: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882818.98773: getting variables 30564 1726882818.98775: in VariableManager get_vars() 30564 1726882818.98804: Calling all_inventory to load vars for managed_node2 30564 1726882818.98807: Calling groups_inventory to load vars for managed_node2 30564 1726882818.98810: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882818.98819: Calling all_plugins_play to load vars for managed_node2 30564 1726882818.98821: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882818.98824: Calling groups_plugins_play to load vars for managed_node2 30564 1726882819.00997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882819.02924: done with get_vars() 30564 1726882819.02945: done getting variables 30564 1726882819.03010: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:40:19 -0400 (0:00:00.102) 0:00:17.611 ****** 30564 1726882819.03042: entering _queue_task() for managed_node2/debug 30564 1726882819.03371: worker is 1 (out of 1 available) 30564 1726882819.03389: exiting _queue_task() for managed_node2/debug 30564 1726882819.03402: done queuing things up, now waiting for results queue to drain 30564 1726882819.03403: waiting for 
pending results... 30564 1726882819.03691: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30564 1726882819.03829: in run() - task 0e448fcc-3ce9-4216-acec-0000000005de 30564 1726882819.03854: variable 'ansible_search_path' from source: unknown 30564 1726882819.03861: variable 'ansible_search_path' from source: unknown 30564 1726882819.03903: calling self._execute() 30564 1726882819.04002: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.04011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.04024: variable 'omit' from source: magic vars 30564 1726882819.04403: variable 'ansible_distribution_major_version' from source: facts 30564 1726882819.04421: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882819.04432: variable 'omit' from source: magic vars 30564 1726882819.04488: variable 'omit' from source: magic vars 30564 1726882819.04591: variable 'current_interfaces' from source: set_fact 30564 1726882819.04627: variable 'omit' from source: magic vars 30564 1726882819.04671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882819.04721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882819.04746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882819.04772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882819.04789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882819.04829: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882819.04836: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.04842: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.04946: Set connection var ansible_timeout to 10 30564 1726882819.04956: Set connection var ansible_pipelining to False 30564 1726882819.04965: Set connection var ansible_shell_type to sh 30564 1726882819.04977: Set connection var ansible_shell_executable to /bin/sh 30564 1726882819.04990: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882819.04996: Set connection var ansible_connection to ssh 30564 1726882819.05031: variable 'ansible_shell_executable' from source: unknown 30564 1726882819.05042: variable 'ansible_connection' from source: unknown 30564 1726882819.05050: variable 'ansible_module_compression' from source: unknown 30564 1726882819.05057: variable 'ansible_shell_type' from source: unknown 30564 1726882819.05065: variable 'ansible_shell_executable' from source: unknown 30564 1726882819.05073: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.05082: variable 'ansible_pipelining' from source: unknown 30564 1726882819.05089: variable 'ansible_timeout' from source: unknown 30564 1726882819.05096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.05243: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882819.05268: variable 'omit' from source: magic vars 30564 1726882819.05278: starting attempt loop 30564 1726882819.05285: running the handler 30564 1726882819.05334: handler run complete 30564 1726882819.05360: attempt loop complete, returning result 30564 1726882819.05372: _execute() done 30564 1726882819.05380: dumping result to json 30564 1726882819.05386: done dumping result, returning 30564 
1726882819.05397: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-4216-acec-0000000005de] 30564 1726882819.05406: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005de ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30564 1726882819.05548: no more pending results, returning what we have 30564 1726882819.05551: results queue empty 30564 1726882819.05552: checking for any_errors_fatal 30564 1726882819.05559: done checking for any_errors_fatal 30564 1726882819.05560: checking for max_fail_percentage 30564 1726882819.05562: done checking for max_fail_percentage 30564 1726882819.05563: checking to see if all hosts have failed and the running result is not ok 30564 1726882819.05566: done checking to see if all hosts have failed 30564 1726882819.05567: getting the remaining hosts for this loop 30564 1726882819.05569: done getting the remaining hosts for this loop 30564 1726882819.05573: getting the next task for host managed_node2 30564 1726882819.05583: done getting next task for host managed_node2 30564 1726882819.05586: ^ task is: TASK: Setup 30564 1726882819.05589: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882819.05594: getting variables 30564 1726882819.05596: in VariableManager get_vars() 30564 1726882819.05629: Calling all_inventory to load vars for managed_node2 30564 1726882819.05632: Calling groups_inventory to load vars for managed_node2 30564 1726882819.05636: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882819.05647: Calling all_plugins_play to load vars for managed_node2 30564 1726882819.05650: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882819.05653: Calling groups_plugins_play to load vars for managed_node2 30564 1726882819.06704: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005de 30564 1726882819.06707: WORKER PROCESS EXITING 30564 1726882819.07392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882819.09203: done with get_vars() 30564 1726882819.09228: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:40:19 -0400 (0:00:00.062) 0:00:17.674 ****** 30564 1726882819.09317: entering _queue_task() for managed_node2/include_tasks 30564 1726882819.09593: worker is 1 (out of 1 available) 30564 1726882819.09607: exiting _queue_task() for managed_node2/include_tasks 30564 1726882819.09619: done queuing things up, now waiting for results queue to drain 30564 1726882819.09620: waiting for pending results... 
30564 1726882819.09907: running TaskExecutor() for managed_node2/TASK: Setup 30564 1726882819.10011: in run() - task 0e448fcc-3ce9-4216-acec-0000000005b7 30564 1726882819.10029: variable 'ansible_search_path' from source: unknown 30564 1726882819.10035: variable 'ansible_search_path' from source: unknown 30564 1726882819.10090: variable 'lsr_setup' from source: include params 30564 1726882819.10307: variable 'lsr_setup' from source: include params 30564 1726882819.10379: variable 'omit' from source: magic vars 30564 1726882819.10511: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.10524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.10540: variable 'omit' from source: magic vars 30564 1726882819.10775: variable 'ansible_distribution_major_version' from source: facts 30564 1726882819.10792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882819.10803: variable 'item' from source: unknown 30564 1726882819.10876: variable 'item' from source: unknown 30564 1726882819.10911: variable 'item' from source: unknown 30564 1726882819.10987: variable 'item' from source: unknown 30564 1726882819.11176: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.11189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.11204: variable 'omit' from source: magic vars 30564 1726882819.11583: variable 'ansible_distribution_major_version' from source: facts 30564 1726882819.11594: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882819.11604: variable 'item' from source: unknown 30564 1726882819.11730: variable 'item' from source: unknown 30564 1726882819.11761: variable 'item' from source: unknown 30564 1726882819.11826: variable 'item' from source: unknown 30564 1726882819.12024: dumping result to json 30564 1726882819.12034: done dumping result, returning 30564 
1726882819.12045: done running TaskExecutor() for managed_node2/TASK: Setup [0e448fcc-3ce9-4216-acec-0000000005b7] 30564 1726882819.12055: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b7 30564 1726882819.12143: no more pending results, returning what we have 30564 1726882819.12148: in VariableManager get_vars() 30564 1726882819.12186: Calling all_inventory to load vars for managed_node2 30564 1726882819.12189: Calling groups_inventory to load vars for managed_node2 30564 1726882819.12193: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882819.12208: Calling all_plugins_play to load vars for managed_node2 30564 1726882819.12211: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882819.12214: Calling groups_plugins_play to load vars for managed_node2 30564 1726882819.13831: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b7 30564 1726882819.13834: WORKER PROCESS EXITING 30564 1726882819.14451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882819.18134: done with get_vars() 30564 1726882819.18161: variable 'ansible_search_path' from source: unknown 30564 1726882819.18163: variable 'ansible_search_path' from source: unknown 30564 1726882819.18212: variable 'ansible_search_path' from source: unknown 30564 1726882819.18213: variable 'ansible_search_path' from source: unknown 30564 1726882819.18244: we have included files to process 30564 1726882819.18246: generating all_blocks data 30564 1726882819.18248: done generating all_blocks data 30564 1726882819.18253: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30564 1726882819.18254: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30564 1726882819.18256: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 30564 1726882819.18484: done processing included file 30564 1726882819.18486: iterating over new_blocks loaded from include file 30564 1726882819.18488: in VariableManager get_vars() 30564 1726882819.18504: done with get_vars() 30564 1726882819.18506: filtering new block on tags 30564 1726882819.18532: done filtering new block on tags 30564 1726882819.18534: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 => (item=tasks/delete_interface.yml) 30564 1726882819.18540: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882819.18541: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882819.18543: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882819.18628: in VariableManager get_vars() 30564 1726882819.18646: done with get_vars() 30564 1726882819.18741: done processing included file 30564 1726882819.18743: iterating over new_blocks loaded from include file 30564 1726882819.18744: in VariableManager get_vars() 30564 1726882819.18757: done with get_vars() 30564 1726882819.18759: filtering new block on tags 30564 1726882819.18792: done filtering new block on tags 30564 1726882819.18795: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 30564 1726882819.18799: extending task lists for all hosts with 
included blocks 30564 1726882819.19413: done extending task lists 30564 1726882819.19415: done processing included files 30564 1726882819.19415: results queue empty 30564 1726882819.19416: checking for any_errors_fatal 30564 1726882819.19420: done checking for any_errors_fatal 30564 1726882819.19421: checking for max_fail_percentage 30564 1726882819.19422: done checking for max_fail_percentage 30564 1726882819.19423: checking to see if all hosts have failed and the running result is not ok 30564 1726882819.19424: done checking to see if all hosts have failed 30564 1726882819.19425: getting the remaining hosts for this loop 30564 1726882819.19426: done getting the remaining hosts for this loop 30564 1726882819.19428: getting the next task for host managed_node2 30564 1726882819.19432: done getting next task for host managed_node2 30564 1726882819.19434: ^ task is: TASK: Remove test interface if necessary 30564 1726882819.19437: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882819.19439: getting variables 30564 1726882819.19440: in VariableManager get_vars() 30564 1726882819.19454: Calling all_inventory to load vars for managed_node2 30564 1726882819.19456: Calling groups_inventory to load vars for managed_node2 30564 1726882819.19459: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882819.19466: Calling all_plugins_play to load vars for managed_node2 30564 1726882819.19469: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882819.19472: Calling groups_plugins_play to load vars for managed_node2 30564 1726882819.20909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882819.22857: done with get_vars() 30564 1726882819.22880: done getting variables 30564 1726882819.22922: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:40:19 -0400 (0:00:00.136) 0:00:17.810 ****** 30564 1726882819.22951: entering _queue_task() for managed_node2/command 30564 1726882819.23273: worker is 1 (out of 1 available) 30564 1726882819.23286: exiting _queue_task() for managed_node2/command 30564 1726882819.23299: done queuing things up, now waiting for results queue to drain 30564 1726882819.23301: waiting for pending results... 
30564 1726882819.23640: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 30564 1726882819.23833: in run() - task 0e448fcc-3ce9-4216-acec-00000000063e 30564 1726882819.23852: variable 'ansible_search_path' from source: unknown 30564 1726882819.23863: variable 'ansible_search_path' from source: unknown 30564 1726882819.23942: calling self._execute() 30564 1726882819.24042: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.24054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.24071: variable 'omit' from source: magic vars 30564 1726882819.24462: variable 'ansible_distribution_major_version' from source: facts 30564 1726882819.24484: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882819.24495: variable 'omit' from source: magic vars 30564 1726882819.24549: variable 'omit' from source: magic vars 30564 1726882819.24648: variable 'interface' from source: play vars 30564 1726882819.24671: variable 'omit' from source: magic vars 30564 1726882819.24718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882819.24761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882819.24788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882819.24811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882819.24829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882819.24868: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882819.24878: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.24886: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.24995: Set connection var ansible_timeout to 10 30564 1726882819.25007: Set connection var ansible_pipelining to False 30564 1726882819.25014: Set connection var ansible_shell_type to sh 30564 1726882819.25025: Set connection var ansible_shell_executable to /bin/sh 30564 1726882819.25037: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882819.25044: Set connection var ansible_connection to ssh 30564 1726882819.25077: variable 'ansible_shell_executable' from source: unknown 30564 1726882819.25087: variable 'ansible_connection' from source: unknown 30564 1726882819.25094: variable 'ansible_module_compression' from source: unknown 30564 1726882819.25102: variable 'ansible_shell_type' from source: unknown 30564 1726882819.25109: variable 'ansible_shell_executable' from source: unknown 30564 1726882819.25115: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.25123: variable 'ansible_pipelining' from source: unknown 30564 1726882819.25129: variable 'ansible_timeout' from source: unknown 30564 1726882819.25137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.25286: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882819.25305: variable 'omit' from source: magic vars 30564 1726882819.25315: starting attempt loop 30564 1726882819.25323: running the handler 30564 1726882819.25342: _low_level_execute_command(): starting 30564 1726882819.25355: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882819.26197: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 
1726882819.26209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.26225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.26250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.26298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.26312: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882819.26326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.26346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882819.26359: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882819.26374: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882819.26392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.26407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.26424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.26437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.26451: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882819.26469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.26546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.26572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.26591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882819.26733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.28396: stdout chunk (state=3): >>>/root <<< 30564 1726882819.28583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.28586: stdout chunk (state=3): >>><<< 30564 1726882819.28589: stderr chunk (state=3): >>><<< 30564 1726882819.28709: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882819.28713: _low_level_execute_command(): starting 30564 1726882819.28716: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262 `" && echo ansible-tmp-1726882819.2861266-31360-108099771885262="` echo 
/root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262 `" ) && sleep 0' 30564 1726882819.29336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882819.29353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.29377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.29395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.29437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.29451: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882819.29470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.29495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882819.29507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882819.29518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882819.29532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.29546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.29562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.29582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.29600: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882819.29615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.29733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 
1726882819.29772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.29802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.29944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.31830: stdout chunk (state=3): >>>ansible-tmp-1726882819.2861266-31360-108099771885262=/root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262 <<< 30564 1726882819.32015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.32019: stdout chunk (state=3): >>><<< 30564 1726882819.32021: stderr chunk (state=3): >>><<< 30564 1726882819.32309: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882819.2861266-31360-108099771885262=/root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 30564 1726882819.32318: variable 'ansible_module_compression' from source: unknown 30564 1726882819.32320: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882819.32322: variable 'ansible_facts' from source: unknown 30564 1726882819.32324: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262/AnsiballZ_command.py 30564 1726882819.32566: Sending initial data 30564 1726882819.32570: Sent initial data (156 bytes) 30564 1726882819.33521: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.33525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.33556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882819.33559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.33563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.33612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.33620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30564 1726882819.33731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.35516: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882819.35613: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882819.35708: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpae0l8_7f /root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262/AnsiballZ_command.py <<< 30564 1726882819.35816: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882819.36852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.36934: stderr chunk (state=3): >>><<< 30564 1726882819.36938: stdout chunk (state=3): >>><<< 30564 1726882819.36960: done transferring module to remote 30564 1726882819.36972: _low_level_execute_command(): starting 30564 1726882819.36975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262/ /root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262/AnsiballZ_command.py && sleep 0' 30564 1726882819.37377: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.37395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.37434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.37437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.37439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.37494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.37496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.37615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.39397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.39439: stderr chunk (state=3): >>><<< 30564 1726882819.39441: stdout chunk (state=3): >>><<< 30564 1726882819.39471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882819.39475: _low_level_execute_command(): starting 30564 1726882819.39479: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262/AnsiballZ_command.py && sleep 0' 30564 1726882819.39878: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.39884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.39895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.39926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.39932: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.39941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882819.39946: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.39951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.39961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.39972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.39975: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882819.39983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.40035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.40056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.40061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.40166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.53855: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:40:19.530543", "end": "2024-09-20 21:40:19.536563", "delta": "0:00:00.006020", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882819.54930: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882819.54994: stderr chunk (state=3): >>><<< 30564 1726882819.54998: stdout chunk (state=3): >>><<< 30564 1726882819.55016: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": ["ip", "link", "del", "statebr"], "start": "2024-09-20 21:40:19.530543", "end": "2024-09-20 21:40:19.536563", "delta": "0:00:00.006020", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del statebr", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
30564 1726882819.55057: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882819.55068: _low_level_execute_command(): starting 30564 1726882819.55073: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882819.2861266-31360-108099771885262/ > /dev/null 2>&1 && sleep 0' 30564 1726882819.55688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882819.55697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.55708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.55722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.55760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.55768: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882819.55782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.55795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882819.55802: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882819.55810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882819.55817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.55825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.55837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.55844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.55851: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882819.55860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.55938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.55954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.55967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.56095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.57981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.58010: stderr chunk (state=3): >>><<< 30564 1726882819.58013: stdout chunk (state=3): >>><<< 30564 1726882819.58369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882819.58373: handler run complete 30564 1726882819.58375: Evaluated conditional (False): False 30564 1726882819.58377: attempt loop complete, returning result 30564 1726882819.58379: _execute() done 30564 1726882819.58381: dumping result to json 30564 1726882819.58383: done dumping result, returning 30564 1726882819.58385: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0e448fcc-3ce9-4216-acec-00000000063e] 30564 1726882819.58387: sending task result for task 0e448fcc-3ce9-4216-acec-00000000063e 30564 1726882819.58460: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000063e 30564 1726882819.58465: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "statebr" ], "delta": "0:00:00.006020", "end": "2024-09-20 21:40:19.536563", "rc": 1, "start": "2024-09-20 21:40:19.530543" } STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30564 1726882819.58528: no more pending results, returning what we have 30564 1726882819.58533: results queue empty 30564 1726882819.58534: checking for any_errors_fatal 30564 1726882819.58535: done checking for any_errors_fatal 30564 1726882819.58536: checking for max_fail_percentage 30564 1726882819.58538: done checking for max_fail_percentage 30564 1726882819.58538: checking to see if all hosts have failed and the running result is not ok 30564 1726882819.58539: done checking to see if all hosts have failed 30564 1726882819.58540: getting the remaining hosts for this loop 30564 1726882819.58541: done getting the remaining hosts for this loop 30564 1726882819.58544: getting the next task for host managed_node2 30564 1726882819.58554: done getting next task for host managed_node2 30564 1726882819.58556: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882819.58560: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30564 1726882819.58566: getting variables 30564 1726882819.58567: in VariableManager get_vars() 30564 1726882819.58596: Calling all_inventory to load vars for managed_node2 30564 1726882819.58599: Calling groups_inventory to load vars for managed_node2 30564 1726882819.58602: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882819.58612: Calling all_plugins_play to load vars for managed_node2 30564 1726882819.58615: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882819.58618: Calling groups_plugins_play to load vars for managed_node2 30564 1726882819.60117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882819.61936: done with get_vars() 30564 1726882819.61961: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:40:19 -0400 (0:00:00.391) 0:00:18.202 ****** 30564 1726882819.62071: entering _queue_task() for managed_node2/include_tasks 30564 1726882819.62390: worker is 1 (out of 1 available) 30564 1726882819.62403: exiting _queue_task() for managed_node2/include_tasks 30564 1726882819.62416: done queuing things up, now waiting for results queue to drain 30564 1726882819.62417: waiting for pending results... 
30564 1726882819.62728: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882819.62843: in run() - task 0e448fcc-3ce9-4216-acec-000000000642 30564 1726882819.62867: variable 'ansible_search_path' from source: unknown 30564 1726882819.62876: variable 'ansible_search_path' from source: unknown 30564 1726882819.62917: calling self._execute() 30564 1726882819.63017: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.63027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.63042: variable 'omit' from source: magic vars 30564 1726882819.63426: variable 'ansible_distribution_major_version' from source: facts 30564 1726882819.63451: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882819.63465: _execute() done 30564 1726882819.63476: dumping result to json 30564 1726882819.63485: done dumping result, returning 30564 1726882819.63497: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-000000000642] 30564 1726882819.63507: sending task result for task 0e448fcc-3ce9-4216-acec-000000000642 30564 1726882819.63647: no more pending results, returning what we have 30564 1726882819.63652: in VariableManager get_vars() 30564 1726882819.63693: Calling all_inventory to load vars for managed_node2 30564 1726882819.63698: Calling groups_inventory to load vars for managed_node2 30564 1726882819.63702: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882819.63719: Calling all_plugins_play to load vars for managed_node2 30564 1726882819.63723: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882819.63727: Calling groups_plugins_play to load vars for managed_node2 30564 1726882819.64812: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000642 30564 1726882819.64816: WORKER PROCESS EXITING 30564 
1726882819.65631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882819.67408: done with get_vars() 30564 1726882819.67428: variable 'ansible_search_path' from source: unknown 30564 1726882819.67429: variable 'ansible_search_path' from source: unknown 30564 1726882819.67443: variable 'item' from source: include params 30564 1726882819.67556: variable 'item' from source: include params 30564 1726882819.67592: we have included files to process 30564 1726882819.67593: generating all_blocks data 30564 1726882819.67594: done generating all_blocks data 30564 1726882819.67599: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882819.67600: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882819.67602: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882819.67789: done processing included file 30564 1726882819.67791: iterating over new_blocks loaded from include file 30564 1726882819.67793: in VariableManager get_vars() 30564 1726882819.67808: done with get_vars() 30564 1726882819.67810: filtering new block on tags 30564 1726882819.67836: done filtering new block on tags 30564 1726882819.67838: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882819.67843: extending task lists for all hosts with included blocks 30564 1726882819.68010: done extending task lists 30564 1726882819.68012: done processing included files 30564 1726882819.68013: results queue empty 30564 1726882819.68013: checking for any_errors_fatal 30564 1726882819.68018: done 
checking for any_errors_fatal 30564 1726882819.68019: checking for max_fail_percentage 30564 1726882819.68020: done checking for max_fail_percentage 30564 1726882819.68020: checking to see if all hosts have failed and the running result is not ok 30564 1726882819.68021: done checking to see if all hosts have failed 30564 1726882819.68022: getting the remaining hosts for this loop 30564 1726882819.68023: done getting the remaining hosts for this loop 30564 1726882819.68026: getting the next task for host managed_node2 30564 1726882819.68030: done getting next task for host managed_node2 30564 1726882819.68032: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882819.68036: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882819.68038: getting variables 30564 1726882819.68038: in VariableManager get_vars() 30564 1726882819.68047: Calling all_inventory to load vars for managed_node2 30564 1726882819.68050: Calling groups_inventory to load vars for managed_node2 30564 1726882819.68052: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882819.68057: Calling all_plugins_play to load vars for managed_node2 30564 1726882819.68059: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882819.68062: Calling groups_plugins_play to load vars for managed_node2 30564 1726882819.69360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882819.71251: done with get_vars() 30564 1726882819.71273: done getting variables 30564 1726882819.71400: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:40:19 -0400 (0:00:00.093) 0:00:18.295 ****** 30564 1726882819.71428: entering _queue_task() for managed_node2/stat 30564 1726882819.71748: worker is 1 (out of 1 available) 30564 1726882819.71760: exiting _queue_task() for managed_node2/stat 30564 1726882819.71777: done queuing things up, now waiting for results queue to drain 30564 1726882819.71779: waiting for pending results... 
30564 1726882819.72059: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882819.72196: in run() - task 0e448fcc-3ce9-4216-acec-000000000691 30564 1726882819.72225: variable 'ansible_search_path' from source: unknown 30564 1726882819.72233: variable 'ansible_search_path' from source: unknown 30564 1726882819.72277: calling self._execute() 30564 1726882819.72388: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.72400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.72416: variable 'omit' from source: magic vars 30564 1726882819.73267: variable 'ansible_distribution_major_version' from source: facts 30564 1726882819.73296: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882819.73308: variable 'omit' from source: magic vars 30564 1726882819.73366: variable 'omit' from source: magic vars 30564 1726882819.73591: variable 'interface' from source: play vars 30564 1726882819.73731: variable 'omit' from source: magic vars 30564 1726882819.73778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882819.73881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882819.73909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882819.73977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882819.73993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882819.74072: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882819.74150: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.74164: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.74391: Set connection var ansible_timeout to 10 30564 1726882819.74402: Set connection var ansible_pipelining to False 30564 1726882819.74409: Set connection var ansible_shell_type to sh 30564 1726882819.74420: Set connection var ansible_shell_executable to /bin/sh 30564 1726882819.74432: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882819.74438: Set connection var ansible_connection to ssh 30564 1726882819.74468: variable 'ansible_shell_executable' from source: unknown 30564 1726882819.74496: variable 'ansible_connection' from source: unknown 30564 1726882819.74504: variable 'ansible_module_compression' from source: unknown 30564 1726882819.74587: variable 'ansible_shell_type' from source: unknown 30564 1726882819.74599: variable 'ansible_shell_executable' from source: unknown 30564 1726882819.74607: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882819.74616: variable 'ansible_pipelining' from source: unknown 30564 1726882819.74623: variable 'ansible_timeout' from source: unknown 30564 1726882819.74631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882819.75219: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882819.75282: variable 'omit' from source: magic vars 30564 1726882819.75303: starting attempt loop 30564 1726882819.75310: running the handler 30564 1726882819.75327: _low_level_execute_command(): starting 30564 1726882819.75338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882819.76069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.76096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.76119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.76144: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.76148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.76151: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.76197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.76209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.76325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.78079: stdout chunk (state=3): >>>/root <<< 30564 1726882819.78337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.78400: stderr chunk (state=3): >>><<< 30564 1726882819.78404: stdout chunk (state=3): >>><<< 30564 1726882819.78428: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882819.78439: _low_level_execute_command(): starting 30564 1726882819.78444: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193 `" && echo ansible-tmp-1726882819.784255-31381-190438394205193="` echo /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193 `" ) && sleep 0' 30564 1726882819.79682: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882819.79691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.79701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.79714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.79749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.79760: stderr 
chunk (state=3): >>>debug2: match not found <<< 30564 1726882819.79765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.79782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882819.79788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882819.79796: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882819.79803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.79812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.79823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.79830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.79837: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882819.79848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.79926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.79939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.79948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.80085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.81943: stdout chunk (state=3): >>>ansible-tmp-1726882819.784255-31381-190438394205193=/root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193 <<< 30564 1726882819.82126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.82129: stdout chunk (state=3): >>><<< 30564 1726882819.82131: stderr chunk (state=3): >>><<< 30564 1726882819.82372: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882819.784255-31381-190438394205193=/root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882819.82376: variable 'ansible_module_compression' from source: unknown 30564 1726882819.82378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882819.82380: variable 'ansible_facts' from source: unknown 30564 1726882819.82396: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193/AnsiballZ_stat.py 30564 1726882819.83202: Sending initial data 30564 1726882819.83205: Sent initial data (152 bytes) 30564 1726882819.84169: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882819.84173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.84212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.84216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.84219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.84292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.84295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.84308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.84435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.86794: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882819.86890: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882819.87001: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp5cl48ixj /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193/AnsiballZ_stat.py <<< 30564 1726882819.87092: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882819.88914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.88970: stderr chunk (state=3): >>><<< 30564 1726882819.88973: stdout chunk (state=3): >>><<< 30564 1726882819.88975: done transferring module to remote 30564 1726882819.88977: _low_level_execute_command(): starting 30564 1726882819.88979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193/ /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193/AnsiballZ_stat.py && sleep 0' 30564 1726882819.89627: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882819.89746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.89761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.89782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.89824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.89835: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882819.89852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30564 1726882819.89871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882819.89888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882819.89898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882819.89909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.89921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.89935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.89963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.89979: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882819.89993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.90182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.90198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.90212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.90400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882819.92219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882819.92222: stdout chunk (state=3): >>><<< 30564 1726882819.92225: stderr chunk (state=3): >>><<< 30564 1726882819.92269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882819.92273: _low_level_execute_command(): starting 30564 1726882819.92275: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193/AnsiballZ_stat.py && sleep 0' 30564 1726882819.93727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882819.93750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.93776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.93795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.93841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.93878: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882819.93893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.93913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 
1726882819.93928: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882819.93939: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882819.93951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882819.93966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882819.93981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882819.94041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882819.94053: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882819.94069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882819.94154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882819.94183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882819.94203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882819.94341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882820.07488: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882820.08581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882820.08585: stdout chunk (state=3): >>><<< 30564 1726882820.08608: stderr chunk (state=3): >>><<< 30564 1726882820.08712: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882820.08718: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882820.08721: _low_level_execute_command(): starting 30564 1726882820.08728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882819.784255-31381-190438394205193/ > /dev/null 2>&1 && sleep 0' 30564 1726882820.09147: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882820.09151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.09186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882820.09190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.09192: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.09232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882820.09238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882820.09354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882820.11199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882820.11237: stderr chunk (state=3): >>><<< 30564 1726882820.11240: stdout chunk (state=3): >>><<< 30564 1726882820.11252: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30564 1726882820.11257: handler run complete 30564 1726882820.11276: attempt loop complete, returning result 30564 1726882820.11279: _execute() done 30564 1726882820.11282: dumping result to json 30564 1726882820.11284: done dumping result, returning 30564 1726882820.11294: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-000000000691] 30564 1726882820.11314: sending task result for task 0e448fcc-3ce9-4216-acec-000000000691 30564 1726882820.11413: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000691 30564 1726882820.11416: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882820.11515: no more pending results, returning what we have 30564 1726882820.11519: results queue empty 30564 1726882820.11520: checking for any_errors_fatal 30564 1726882820.11522: done checking for any_errors_fatal 30564 1726882820.11522: checking for max_fail_percentage 30564 1726882820.11524: done checking for max_fail_percentage 30564 1726882820.11525: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.11525: done checking to see if all hosts have failed 30564 1726882820.11526: getting the remaining hosts for this loop 30564 1726882820.11528: done getting the remaining hosts for this loop 30564 1726882820.11533: getting the next task for host managed_node2 30564 1726882820.11541: done getting next task for host managed_node2 30564 1726882820.11543: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30564 1726882820.11546: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882820.11550: getting variables 30564 1726882820.11551: in VariableManager get_vars() 30564 1726882820.11582: Calling all_inventory to load vars for managed_node2 30564 1726882820.11585: Calling groups_inventory to load vars for managed_node2 30564 1726882820.11588: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.11598: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.11600: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.11602: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.12427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.13518: done with get_vars() 30564 1726882820.13540: done getting variables 30564 1726882820.13611: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882820.13740: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:40:20 -0400 (0:00:00.423) 0:00:18.719 ****** 30564 1726882820.13773: entering _queue_task() for managed_node2/assert 30564 1726882820.14082: worker is 1 (out of 1 available) 30564 1726882820.14095: exiting _queue_task() for managed_node2/assert 30564 1726882820.14107: done queuing things up, now waiting for results queue to drain 30564 1726882820.14109: waiting for pending results... 30564 1726882820.14417: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30564 1726882820.14526: in run() - task 0e448fcc-3ce9-4216-acec-000000000643 30564 1726882820.14538: variable 'ansible_search_path' from source: unknown 30564 1726882820.14541: variable 'ansible_search_path' from source: unknown 30564 1726882820.14588: calling self._execute() 30564 1726882820.14687: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.14699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.14720: variable 'omit' from source: magic vars 30564 1726882820.15099: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.15119: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.15127: variable 'omit' from source: magic vars 30564 1726882820.15172: variable 'omit' from source: magic vars 30564 1726882820.15260: variable 'interface' from source: play vars 30564 1726882820.15277: variable 'omit' from source: magic vars 30564 1726882820.15310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882820.15336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882820.15356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
30564 1726882820.15374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882820.15384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882820.15415: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882820.15420: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.15423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.15504: Set connection var ansible_timeout to 10 30564 1726882820.15507: Set connection var ansible_pipelining to False 30564 1726882820.15510: Set connection var ansible_shell_type to sh 30564 1726882820.15516: Set connection var ansible_shell_executable to /bin/sh 30564 1726882820.15522: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882820.15524: Set connection var ansible_connection to ssh 30564 1726882820.15544: variable 'ansible_shell_executable' from source: unknown 30564 1726882820.15547: variable 'ansible_connection' from source: unknown 30564 1726882820.15550: variable 'ansible_module_compression' from source: unknown 30564 1726882820.15552: variable 'ansible_shell_type' from source: unknown 30564 1726882820.15554: variable 'ansible_shell_executable' from source: unknown 30564 1726882820.15558: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.15565: variable 'ansible_pipelining' from source: unknown 30564 1726882820.15569: variable 'ansible_timeout' from source: unknown 30564 1726882820.15575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.15671: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882820.15683: variable 'omit' from source: magic vars 30564 1726882820.15688: starting attempt loop 30564 1726882820.15691: running the handler 30564 1726882820.15794: variable 'interface_stat' from source: set_fact 30564 1726882820.15802: Evaluated conditional (not interface_stat.stat.exists): True 30564 1726882820.15807: handler run complete 30564 1726882820.15817: attempt loop complete, returning result 30564 1726882820.15820: _execute() done 30564 1726882820.15822: dumping result to json 30564 1726882820.15825: done dumping result, returning 30564 1726882820.15831: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0e448fcc-3ce9-4216-acec-000000000643] 30564 1726882820.15836: sending task result for task 0e448fcc-3ce9-4216-acec-000000000643 30564 1726882820.15921: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000643 30564 1726882820.15923: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882820.15968: no more pending results, returning what we have 30564 1726882820.15972: results queue empty 30564 1726882820.15972: checking for any_errors_fatal 30564 1726882820.15981: done checking for any_errors_fatal 30564 1726882820.15982: checking for max_fail_percentage 30564 1726882820.15983: done checking for max_fail_percentage 30564 1726882820.15984: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.15985: done checking to see if all hosts have failed 30564 1726882820.15986: getting the remaining hosts for this loop 30564 1726882820.15988: done getting the remaining hosts for this loop 30564 1726882820.15991: getting the next task for host managed_node2 30564 1726882820.16000: done getting next task for host managed_node2 
30564 1726882820.16003: ^ task is: TASK: Test 30564 1726882820.16006: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882820.16011: getting variables 30564 1726882820.16012: in VariableManager get_vars() 30564 1726882820.16039: Calling all_inventory to load vars for managed_node2 30564 1726882820.16041: Calling groups_inventory to load vars for managed_node2 30564 1726882820.16044: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.16052: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.16055: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.16058: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.16995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.18585: done with get_vars() 30564 1726882820.18605: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:40:20 -0400 (0:00:00.049) 0:00:18.768 ****** 30564 1726882820.18685: entering _queue_task() for managed_node2/include_tasks 30564 1726882820.18895: worker is 1 (out of 1 available) 30564 1726882820.18911: exiting _queue_task() for managed_node2/include_tasks 30564 1726882820.18923: done queuing things up, 
now waiting for results queue to drain 30564 1726882820.18924: waiting for pending results... 30564 1726882820.19122: running TaskExecutor() for managed_node2/TASK: Test 30564 1726882820.19230: in run() - task 0e448fcc-3ce9-4216-acec-0000000005b8 30564 1726882820.19248: variable 'ansible_search_path' from source: unknown 30564 1726882820.19257: variable 'ansible_search_path' from source: unknown 30564 1726882820.19314: variable 'lsr_test' from source: include params 30564 1726882820.19530: variable 'lsr_test' from source: include params 30564 1726882820.19606: variable 'omit' from source: magic vars 30564 1726882820.19750: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.19769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.19788: variable 'omit' from source: magic vars 30564 1726882820.20039: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.20065: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.20079: variable 'item' from source: unknown 30564 1726882820.20142: variable 'item' from source: unknown 30564 1726882820.20188: variable 'item' from source: unknown 30564 1726882820.20250: variable 'item' from source: unknown 30564 1726882820.20406: dumping result to json 30564 1726882820.20409: done dumping result, returning 30564 1726882820.20411: done running TaskExecutor() for managed_node2/TASK: Test [0e448fcc-3ce9-4216-acec-0000000005b8] 30564 1726882820.20414: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b8 30564 1726882820.20470: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b8 30564 1726882820.20473: WORKER PROCESS EXITING 30564 1726882820.20493: no more pending results, returning what we have 30564 1726882820.20497: in VariableManager get_vars() 30564 1726882820.20545: Calling all_inventory to load vars for managed_node2 30564 1726882820.20549: Calling groups_inventory 
to load vars for managed_node2 30564 1726882820.20552: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.20564: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.20574: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.20578: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.21373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.22388: done with get_vars() 30564 1726882820.22406: variable 'ansible_search_path' from source: unknown 30564 1726882820.22407: variable 'ansible_search_path' from source: unknown 30564 1726882820.22441: we have included files to process 30564 1726882820.22442: generating all_blocks data 30564 1726882820.22444: done generating all_blocks data 30564 1726882820.22448: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30564 1726882820.22449: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30564 1726882820.22451: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml 30564 1726882820.22748: done processing included file 30564 1726882820.22750: iterating over new_blocks loaded from include file 30564 1726882820.22752: in VariableManager get_vars() 30564 1726882820.22771: done with get_vars() 30564 1726882820.22772: filtering new block on tags 30564 1726882820.22804: done filtering new block on tags 30564 1726882820.22807: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml for managed_node2 => 
(item=tasks/create_bridge_profile_no_autoconnect.yml) 30564 1726882820.22811: extending task lists for all hosts with included blocks 30564 1726882820.23614: done extending task lists 30564 1726882820.23615: done processing included files 30564 1726882820.23616: results queue empty 30564 1726882820.23616: checking for any_errors_fatal 30564 1726882820.23619: done checking for any_errors_fatal 30564 1726882820.23620: checking for max_fail_percentage 30564 1726882820.23620: done checking for max_fail_percentage 30564 1726882820.23621: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.23621: done checking to see if all hosts have failed 30564 1726882820.23622: getting the remaining hosts for this loop 30564 1726882820.23623: done getting the remaining hosts for this loop 30564 1726882820.23625: getting the next task for host managed_node2 30564 1726882820.23628: done getting next task for host managed_node2 30564 1726882820.23629: ^ task is: TASK: Include network role 30564 1726882820.23631: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882820.23632: getting variables 30564 1726882820.23633: in VariableManager get_vars() 30564 1726882820.23640: Calling all_inventory to load vars for managed_node2 30564 1726882820.23641: Calling groups_inventory to load vars for managed_node2 30564 1726882820.23642: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.23646: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.23647: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.23649: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.24411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.25325: done with get_vars() 30564 1726882820.25338: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:3 Friday 20 September 2024 21:40:20 -0400 (0:00:00.067) 0:00:18.835 ****** 30564 1726882820.25391: entering _queue_task() for managed_node2/include_role 30564 1726882820.25606: worker is 1 (out of 1 available) 30564 1726882820.25619: exiting _queue_task() for managed_node2/include_role 30564 1726882820.25630: done queuing things up, now waiting for results queue to drain 30564 1726882820.25632: waiting for pending results... 
30564 1726882820.25804: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882820.25879: in run() - task 0e448fcc-3ce9-4216-acec-0000000006b1 30564 1726882820.25891: variable 'ansible_search_path' from source: unknown 30564 1726882820.25895: variable 'ansible_search_path' from source: unknown 30564 1726882820.25921: calling self._execute() 30564 1726882820.25989: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.25993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.26002: variable 'omit' from source: magic vars 30564 1726882820.26251: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.26260: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.26270: _execute() done 30564 1726882820.26273: dumping result to json 30564 1726882820.26278: done dumping result, returning 30564 1726882820.26281: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-0000000006b1] 30564 1726882820.26289: sending task result for task 0e448fcc-3ce9-4216-acec-0000000006b1 30564 1726882820.26391: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000006b1 30564 1726882820.26394: WORKER PROCESS EXITING 30564 1726882820.26427: no more pending results, returning what we have 30564 1726882820.26432: in VariableManager get_vars() 30564 1726882820.26466: Calling all_inventory to load vars for managed_node2 30564 1726882820.26471: Calling groups_inventory to load vars for managed_node2 30564 1726882820.26474: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.26483: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.26486: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.26489: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.27358: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.28292: done with get_vars() 30564 1726882820.28304: variable 'ansible_search_path' from source: unknown 30564 1726882820.28305: variable 'ansible_search_path' from source: unknown 30564 1726882820.28416: variable 'omit' from source: magic vars 30564 1726882820.28440: variable 'omit' from source: magic vars 30564 1726882820.28449: variable 'omit' from source: magic vars 30564 1726882820.28452: we have included files to process 30564 1726882820.28453: generating all_blocks data 30564 1726882820.28454: done generating all_blocks data 30564 1726882820.28456: processing included file: fedora.linux_system_roles.network 30564 1726882820.28474: in VariableManager get_vars() 30564 1726882820.28482: done with get_vars() 30564 1726882820.28500: in VariableManager get_vars() 30564 1726882820.28510: done with get_vars() 30564 1726882820.28535: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882820.28609: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882820.28654: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882820.28920: in VariableManager get_vars() 30564 1726882820.28933: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882820.30174: iterating over new_blocks loaded from include file 30564 1726882820.30175: in VariableManager get_vars() 30564 1726882820.30186: done with get_vars() 30564 1726882820.30188: filtering new block on tags 30564 1726882820.30351: done filtering new block on tags 30564 1726882820.30354: in VariableManager get_vars() 30564 1726882820.30362: done with get_vars() 30564 1726882820.30365: filtering new block on tags 30564 1726882820.30377: done 
filtering new block on tags 30564 1726882820.30378: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882820.30382: extending task lists for all hosts with included blocks 30564 1726882820.30481: done extending task lists 30564 1726882820.30482: done processing included files 30564 1726882820.30482: results queue empty 30564 1726882820.30483: checking for any_errors_fatal 30564 1726882820.30485: done checking for any_errors_fatal 30564 1726882820.30485: checking for max_fail_percentage 30564 1726882820.30486: done checking for max_fail_percentage 30564 1726882820.30486: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.30487: done checking to see if all hosts have failed 30564 1726882820.30487: getting the remaining hosts for this loop 30564 1726882820.30488: done getting the remaining hosts for this loop 30564 1726882820.30490: getting the next task for host managed_node2 30564 1726882820.30493: done getting next task for host managed_node2 30564 1726882820.30495: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882820.30497: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882820.30503: getting variables 30564 1726882820.30504: in VariableManager get_vars() 30564 1726882820.30511: Calling all_inventory to load vars for managed_node2 30564 1726882820.30513: Calling groups_inventory to load vars for managed_node2 30564 1726882820.30514: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.30517: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.30518: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.30520: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.31250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.32181: done with get_vars() 30564 1726882820.32195: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:40:20 -0400 (0:00:00.068) 0:00:18.903 ****** 30564 1726882820.32241: entering _queue_task() for managed_node2/include_tasks 30564 1726882820.32482: worker is 1 (out of 1 available) 30564 1726882820.32495: exiting _queue_task() for managed_node2/include_tasks 30564 1726882820.32508: done queuing things up, now waiting for results queue to drain 30564 1726882820.32509: waiting for pending results... 
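The nested `HOST STATE` dumps above (`HOST STATE: block=4, task=2, ... tasks child state? (HOST STATE: ...)`) are a recursive structure: each `include_tasks`/`include_role` pushes a child state whose cursor tracks block and task indices. A minimal sketch of that shape, with field names taken from the log and the class structure assumed for illustration (this is not ansible-core's actual `HostState` class):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    """Simplified model of the nested HOST STATE dumps in this log."""
    block: int
    task: int
    tasks_child: Optional["HostState"] = None  # pushed by each include

    def depth(self) -> int:
        """Include-nesting depth: how many states are stacked."""
        return 1 if self.tasks_child is None else 1 + self.tasks_child.depth()

# Rough shape of the state dumped after "Include network role":
# block=4/task=2 at the play level, with two nested include levels below it.
state = HostState(block=4, task=2,
                  tasks_child=HostState(block=0, task=9,
                                        tasks_child=HostState(block=0, task=1)))
print(state.depth())  # → 3
```

Reading the dumps this way explains why they grow by one `(HOST STATE: ...)` layer each time an included file adds another level, as seen between the two dumps above.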
30564 1726882820.32687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882820.32772: in run() - task 0e448fcc-3ce9-4216-acec-00000000072f 30564 1726882820.32781: variable 'ansible_search_path' from source: unknown 30564 1726882820.32785: variable 'ansible_search_path' from source: unknown 30564 1726882820.32814: calling self._execute() 30564 1726882820.32886: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.32890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.32900: variable 'omit' from source: magic vars 30564 1726882820.33164: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.33176: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.33182: _execute() done 30564 1726882820.33185: dumping result to json 30564 1726882820.33187: done dumping result, returning 30564 1726882820.33194: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-00000000072f] 30564 1726882820.33199: sending task result for task 0e448fcc-3ce9-4216-acec-00000000072f 30564 1726882820.33285: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000072f 30564 1726882820.33287: WORKER PROCESS EXITING 30564 1726882820.33331: no more pending results, returning what we have 30564 1726882820.33335: in VariableManager get_vars() 30564 1726882820.33378: Calling all_inventory to load vars for managed_node2 30564 1726882820.33381: Calling groups_inventory to load vars for managed_node2 30564 1726882820.33383: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.33392: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.33395: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.33404: Calling 
groups_plugins_play to load vars for managed_node2 30564 1726882820.34191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.35121: done with get_vars() 30564 1726882820.35136: variable 'ansible_search_path' from source: unknown 30564 1726882820.35137: variable 'ansible_search_path' from source: unknown 30564 1726882820.35162: we have included files to process 30564 1726882820.35163: generating all_blocks data 30564 1726882820.35165: done generating all_blocks data 30564 1726882820.35169: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882820.35170: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882820.35171: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882820.35536: done processing included file 30564 1726882820.35538: iterating over new_blocks loaded from include file 30564 1726882820.35539: in VariableManager get_vars() 30564 1726882820.35552: done with get_vars() 30564 1726882820.35553: filtering new block on tags 30564 1726882820.35577: done filtering new block on tags 30564 1726882820.35579: in VariableManager get_vars() 30564 1726882820.35592: done with get_vars() 30564 1726882820.35593: filtering new block on tags 30564 1726882820.35618: done filtering new block on tags 30564 1726882820.35620: in VariableManager get_vars() 30564 1726882820.35632: done with get_vars() 30564 1726882820.35633: filtering new block on tags 30564 1726882820.35656: done filtering new block on tags 30564 1726882820.35657: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882820.35660: extending task lists for 
all hosts with included blocks 30564 1726882820.36665: done extending task lists 30564 1726882820.36667: done processing included files 30564 1726882820.36667: results queue empty 30564 1726882820.36668: checking for any_errors_fatal 30564 1726882820.36670: done checking for any_errors_fatal 30564 1726882820.36671: checking for max_fail_percentage 30564 1726882820.36672: done checking for max_fail_percentage 30564 1726882820.36672: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.36673: done checking to see if all hosts have failed 30564 1726882820.36673: getting the remaining hosts for this loop 30564 1726882820.36674: done getting the remaining hosts for this loop 30564 1726882820.36676: getting the next task for host managed_node2 30564 1726882820.36680: done getting next task for host managed_node2 30564 1726882820.36681: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882820.36684: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882820.36690: getting variables 30564 1726882820.36690: in VariableManager get_vars() 30564 1726882820.36698: Calling all_inventory to load vars for managed_node2 30564 1726882820.36699: Calling groups_inventory to load vars for managed_node2 30564 1726882820.36701: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.36704: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.36705: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.36707: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.37386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.38295: done with get_vars() 30564 1726882820.38309: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:40:20 -0400 (0:00:00.061) 0:00:18.964 ****** 30564 1726882820.38356: entering _queue_task() for managed_node2/setup 30564 1726882820.38576: worker is 1 (out of 1 available) 30564 1726882820.38589: exiting _queue_task() for managed_node2/setup 30564 1726882820.38601: done queuing things up, now waiting for results queue to drain 30564 1726882820.38603: waiting for pending results... 
30564 1726882820.38797: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882820.38895: in run() - task 0e448fcc-3ce9-4216-acec-00000000078c 30564 1726882820.38907: variable 'ansible_search_path' from source: unknown 30564 1726882820.38910: variable 'ansible_search_path' from source: unknown 30564 1726882820.38940: calling self._execute() 30564 1726882820.39020: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.39024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.39033: variable 'omit' from source: magic vars 30564 1726882820.39304: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.39315: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.39462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882820.41024: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882820.41068: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882820.41097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882820.41125: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882820.41146: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882820.41205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882820.41229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882820.41246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882820.41277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882820.41288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882820.41323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882820.41342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882820.41359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882820.41388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882820.41398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882820.41510: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882820.41517: variable 'ansible_facts' from source: unknown 30564 1726882820.42085: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882820.42090: when evaluation is False, skipping this task 30564 1726882820.42093: _execute() done 30564 1726882820.42096: dumping result to json 30564 1726882820.42098: done dumping result, returning 30564 1726882820.42103: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-00000000078c] 30564 1726882820.42110: sending task result for task 0e448fcc-3ce9-4216-acec-00000000078c 30564 1726882820.42194: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000078c 30564 1726882820.42197: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882820.42250: no more pending results, returning what we have 30564 1726882820.42254: results queue empty 30564 1726882820.42255: checking for any_errors_fatal 30564 1726882820.42257: done checking for any_errors_fatal 30564 1726882820.42257: checking for max_fail_percentage 30564 1726882820.42259: done checking for max_fail_percentage 30564 1726882820.42260: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.42260: done checking to see if all hosts have failed 30564 1726882820.42261: getting the remaining hosts for this loop 30564 1726882820.42263: done getting the remaining hosts for this loop 30564 1726882820.42269: getting the next task for host managed_node2 30564 1726882820.42281: done getting next task for host managed_node2 30564 1726882820.42286: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882820.42292: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882820.42313: getting variables 30564 1726882820.42315: in VariableManager get_vars() 30564 1726882820.42351: Calling all_inventory to load vars for managed_node2 30564 1726882820.42353: Calling groups_inventory to load vars for managed_node2 30564 1726882820.42355: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.42366: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.42369: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.42378: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.46095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.47018: done with get_vars() 30564 1726882820.47034: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:40:20 -0400 (0:00:00.087) 0:00:19.052 ****** 30564 1726882820.47095: entering _queue_task() for managed_node2/stat 30564 1726882820.47355: worker is 1 (out of 1 available) 30564 1726882820.47383: exiting _queue_task() for managed_node2/stat 30564 1726882820.47394: done queuing things up, now waiting for results queue to drain 30564 1726882820.47397: waiting for pending results... 
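The skip decision logged above ("Ensure ansible_facts used by role are present") hinges on the Jinja2 expression `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. A plain-Python sketch of that filter logic, with example fact names assumed purely for illustration (the real `__network_required_facts` list lives in the role's defaults):

```python
# Sketch of the `difference` filter as used in the conditional above:
# items of the first list that are absent from the second.
def missing_facts(required, facts):
    """Return required fact names not present in the gathered facts."""
    return [name for name in required if name not in facts]

# Hypothetical inputs; real values come from the role defaults and setup module.
required = ["distribution", "distribution_major_version", "os_family"]
facts = {"distribution": "Fedora",
         "distribution_major_version": "40",
         "os_family": "RedHat"}

# All required facts are already cached, so the conditional is False
# and the setup task is skipped, exactly as the log reports.
print(len(missing_facts(required, facts)) > 0)  # → False
```

When any required fact is missing the expression is True and the role re-runs fact gathering; the `censored`/`no_log: true` body in the skip result is the role deliberately hiding the fact list from output.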
30564 1726882820.47637: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882820.47752: in run() - task 0e448fcc-3ce9-4216-acec-00000000078e 30564 1726882820.47765: variable 'ansible_search_path' from source: unknown 30564 1726882820.47769: variable 'ansible_search_path' from source: unknown 30564 1726882820.47799: calling self._execute() 30564 1726882820.47873: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.47880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.47889: variable 'omit' from source: magic vars 30564 1726882820.48160: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.48174: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.48294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882820.48490: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882820.48527: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882820.48565: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882820.48596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882820.48658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882820.48682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882820.48701: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882820.48718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882820.48785: variable '__network_is_ostree' from source: set_fact 30564 1726882820.48789: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882820.48792: when evaluation is False, skipping this task 30564 1726882820.48794: _execute() done 30564 1726882820.48796: dumping result to json 30564 1726882820.48799: done dumping result, returning 30564 1726882820.48807: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-00000000078e] 30564 1726882820.48812: sending task result for task 0e448fcc-3ce9-4216-acec-00000000078e 30564 1726882820.49150: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000078e 30564 1726882820.49153: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882820.49195: no more pending results, returning what we have 30564 1726882820.49198: results queue empty 30564 1726882820.49199: checking for any_errors_fatal 30564 1726882820.49203: done checking for any_errors_fatal 30564 1726882820.49204: checking for max_fail_percentage 30564 1726882820.49206: done checking for max_fail_percentage 30564 1726882820.49207: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.49207: done checking to see if all hosts have failed 30564 1726882820.49208: getting the remaining hosts for this loop 30564 1726882820.49209: done getting the remaining hosts for this loop 30564 
1726882820.49212: getting the next task for host managed_node2 30564 1726882820.49219: done getting next task for host managed_node2 30564 1726882820.49222: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882820.49228: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882820.49242: getting variables 30564 1726882820.49244: in VariableManager get_vars() 30564 1726882820.49278: Calling all_inventory to load vars for managed_node2 30564 1726882820.49281: Calling groups_inventory to load vars for managed_node2 30564 1726882820.49283: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.49292: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.49295: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.49297: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.50649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.52250: done with get_vars() 30564 1726882820.52268: done getting variables 30564 1726882820.52308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:40:20 -0400 (0:00:00.052) 0:00:19.104 ****** 30564 1726882820.52335: entering _queue_task() for managed_node2/set_fact 30564 1726882820.52544: worker is 1 (out of 1 available) 30564 1726882820.52559: exiting _queue_task() for managed_node2/set_fact 30564 1726882820.52573: done queuing things up, now waiting for results queue to drain 30564 1726882820.52574: waiting for pending results... 
30564 1726882820.52756: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882820.52866: in run() - task 0e448fcc-3ce9-4216-acec-00000000078f 30564 1726882820.52881: variable 'ansible_search_path' from source: unknown 30564 1726882820.52885: variable 'ansible_search_path' from source: unknown 30564 1726882820.53082: calling self._execute() 30564 1726882820.53086: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.53089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.53092: variable 'omit' from source: magic vars 30564 1726882820.53371: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.53375: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.53772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882820.54082: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882820.54086: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882820.54088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882820.54091: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882820.54094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882820.54096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882820.54099: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882820.54182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882820.54290: variable '__network_is_ostree' from source: set_fact 30564 1726882820.54293: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882820.54295: when evaluation is False, skipping this task 30564 1726882820.54297: _execute() done 30564 1726882820.54299: dumping result to json 30564 1726882820.54301: done dumping result, returning 30564 1726882820.54303: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-00000000078f] 30564 1726882820.54305: sending task result for task 0e448fcc-3ce9-4216-acec-00000000078f 30564 1726882820.54362: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000078f 30564 1726882820.54367: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882820.54524: no more pending results, returning what we have 30564 1726882820.54527: results queue empty 30564 1726882820.54528: checking for any_errors_fatal 30564 1726882820.54532: done checking for any_errors_fatal 30564 1726882820.54532: checking for max_fail_percentage 30564 1726882820.54534: done checking for max_fail_percentage 30564 1726882820.54535: checking to see if all hosts have failed and the running result is not ok 30564 1726882820.54535: done checking to see if all hosts have failed 30564 1726882820.54536: getting the remaining hosts for this loop 30564 1726882820.54537: done getting the remaining hosts for this loop 
30564 1726882820.54540: getting the next task for host managed_node2 30564 1726882820.54549: done getting next task for host managed_node2 30564 1726882820.54552: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882820.54557: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882820.54576: getting variables 30564 1726882820.54577: in VariableManager get_vars() 30564 1726882820.54606: Calling all_inventory to load vars for managed_node2 30564 1726882820.54608: Calling groups_inventory to load vars for managed_node2 30564 1726882820.54610: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882820.54619: Calling all_plugins_play to load vars for managed_node2 30564 1726882820.54621: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882820.54624: Calling groups_plugins_play to load vars for managed_node2 30564 1726882820.56108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882820.57881: done with get_vars() 30564 1726882820.57903: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:40:20 -0400 (0:00:00.056) 0:00:19.161 ****** 30564 1726882820.57997: entering _queue_task() for managed_node2/service_facts 30564 1726882820.58284: worker is 1 (out of 1 available) 30564 1726882820.58298: exiting _queue_task() for managed_node2/service_facts 30564 1726882820.58309: done queuing things up, now waiting for results queue to drain 30564 1726882820.58310: waiting for pending results... 
30564 1726882820.58581: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882820.58717: in run() - task 0e448fcc-3ce9-4216-acec-000000000791 30564 1726882820.58731: variable 'ansible_search_path' from source: unknown 30564 1726882820.58734: variable 'ansible_search_path' from source: unknown 30564 1726882820.58774: calling self._execute() 30564 1726882820.58878: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.58884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.58893: variable 'omit' from source: magic vars 30564 1726882820.59259: variable 'ansible_distribution_major_version' from source: facts 30564 1726882820.59276: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882820.59283: variable 'omit' from source: magic vars 30564 1726882820.59365: variable 'omit' from source: magic vars 30564 1726882820.59398: variable 'omit' from source: magic vars 30564 1726882820.59440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882820.59479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882820.59496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882820.59517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882820.59528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882820.59557: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882820.59560: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.59563: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882820.59667: Set connection var ansible_timeout to 10 30564 1726882820.59675: Set connection var ansible_pipelining to False 30564 1726882820.59678: Set connection var ansible_shell_type to sh 30564 1726882820.59684: Set connection var ansible_shell_executable to /bin/sh 30564 1726882820.59692: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882820.59694: Set connection var ansible_connection to ssh 30564 1726882820.59720: variable 'ansible_shell_executable' from source: unknown 30564 1726882820.59724: variable 'ansible_connection' from source: unknown 30564 1726882820.59731: variable 'ansible_module_compression' from source: unknown 30564 1726882820.59733: variable 'ansible_shell_type' from source: unknown 30564 1726882820.59736: variable 'ansible_shell_executable' from source: unknown 30564 1726882820.59739: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882820.59743: variable 'ansible_pipelining' from source: unknown 30564 1726882820.59745: variable 'ansible_timeout' from source: unknown 30564 1726882820.59749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882820.59943: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882820.59956: variable 'omit' from source: magic vars 30564 1726882820.59961: starting attempt loop 30564 1726882820.59966: running the handler 30564 1726882820.59982: _low_level_execute_command(): starting 30564 1726882820.59989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882820.60744: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882820.60755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882820.60768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882820.60785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.60823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882820.60831: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882820.60846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.60859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882820.60869: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882820.60879: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882820.60887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.60897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882820.60908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.60916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882820.60923: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882820.60933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.61013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882820.61028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882820.61031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882820.61281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882820.62842: stdout chunk (state=3): >>>/root <<< 30564 1726882820.63008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882820.63012: stdout chunk (state=3): >>><<< 30564 1726882820.63021: stderr chunk (state=3): >>><<< 30564 1726882820.63045: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882820.63058: _low_level_execute_command(): starting 30564 1726882820.63065: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331 `" && echo ansible-tmp-1726882820.6304421-31423-43337230171331="` echo /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331 `" ) && sleep 0' 30564 1726882820.65427: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.65433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.65476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882820.65483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.65498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882820.65504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.65518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882820.65521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.65597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882820.65610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882820.65619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882820.65744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882820.67656: stdout chunk (state=3): >>>ansible-tmp-1726882820.6304421-31423-43337230171331=/root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331 <<< 30564 1726882820.67833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882820.67836: stderr 
chunk (state=3): >>><<< 30564 1726882820.67841: stdout chunk (state=3): >>><<< 30564 1726882820.67860: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882820.6304421-31423-43337230171331=/root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882820.67911: variable 'ansible_module_compression' from source: unknown 30564 1726882820.67957: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882820.67997: variable 'ansible_facts' from source: unknown 30564 1726882820.68079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331/AnsiballZ_service_facts.py 30564 1726882820.68619: Sending initial data 30564 1726882820.68626: Sent initial data (161 bytes) 30564 
1726882820.70943: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882820.70981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.71025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882820.71040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.71083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882820.71131: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882820.71141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.71154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882820.71162: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882820.71172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882820.71178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.71188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882820.71204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.71290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882820.71293: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882820.71296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.72137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882820.72154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882820.72162: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882820.72336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882820.74096: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882820.74101: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882820.74193: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882820.74297: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp_4qkjhm8 /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331/AnsiballZ_service_facts.py <<< 30564 1726882820.74393: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882820.76085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882820.76091: stderr chunk (state=3): >>><<< 30564 1726882820.76095: stdout chunk (state=3): >>><<< 30564 1726882820.76114: done transferring module to remote 30564 1726882820.76125: _low_level_execute_command(): starting 30564 1726882820.76131: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331/ /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331/AnsiballZ_service_facts.py && sleep 0' 30564 
1726882820.77635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.77643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.77880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882820.77887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882820.77992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.78155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882820.78781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882820.78788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882820.78908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882820.80720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882820.80724: stderr chunk (state=3): >>><<< 30564 1726882820.80729: stdout chunk (state=3): >>><<< 30564 1726882820.80745: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882820.80748: _low_level_execute_command(): starting 30564 1726882820.80756: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331/AnsiballZ_service_facts.py && sleep 0' 30564 1726882820.82574: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882820.82583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.82594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882820.82607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.82643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882820.82650: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882820.82661: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.82684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882820.82691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882820.82697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882820.82705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882820.82714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882820.82725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882820.82732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882820.82739: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882820.82747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882820.82921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882820.82934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882820.82945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882820.83225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882822.17051: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": 
"modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 30564 1726882822.17121: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": 
{"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": 
"systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": 
"selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "system<<< 30564 1726882822.17129: stdout chunk (state=3): >>>d"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882822.18409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
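The `service_facts` payload above maps each unit name to a dict with `name`, `state`, `status`, and `source` keys. As an illustrative sketch only (the dict literal below is a hand-copied excerpt of three entries from the log, not the full payload), this is how that structure can be filtered once it lands in `ansible_facts.services`:

```python
# Small excerpt (assumed representative shape) of the
# "ansible_facts.services" mapping emitted by the service_facts module.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "nfs-server.service": {"name": "nfs-server.service", "state": "stopped",
                           "status": "disabled", "source": "systemd"},
    "NetworkManager.service": {"name": "NetworkManager.service",
                               "state": "running", "status": "enabled",
                               "source": "systemd"},
}

# Keep only the units systemd reports as currently running.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # ['NetworkManager.service', 'sshd.service']
```

Inside a playbook the same filter is typically expressed with Jinja2 (e.g. `ansible_facts.services | dict2items | selectattr('value.state', 'eq', 'running')`) rather than Python, but the data shape being queried is identical.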
<<< 30564 1726882822.18412: stdout chunk (state=3): >>><<< 30564 1726882822.18415: stderr chunk (state=3): >>><<< 30564 1726882822.18774: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882822.19255: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882822.19325: _low_level_execute_command(): starting 30564 1726882822.19335: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882820.6304421-31423-43337230171331/ > /dev/null 2>&1 && sleep 0' 30564 1726882822.20614: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.20618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882822.20654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882822.20657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.20660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882822.20662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.20727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882822.21702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882822.21797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882822.23629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882822.23707: stderr chunk (state=3): >>><<< 30564 1726882822.23710: stdout chunk (state=3): >>><<< 30564 1726882822.23973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 30564 1726882822.23976: handler run complete 30564 1726882822.23979: variable 'ansible_facts' from source: unknown 30564 1726882822.24087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882822.24815: variable 'ansible_facts' from source: unknown 30564 1726882822.25069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882822.25474: attempt loop complete, returning result 30564 1726882822.25577: _execute() done 30564 1726882822.25584: dumping result to json 30564 1726882822.25648: done dumping result, returning 30564 1726882822.25712: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000000791] 30564 1726882822.25722: sending task result for task 0e448fcc-3ce9-4216-acec-000000000791 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882822.27346: no more pending results, returning what we have 30564 1726882822.27349: results queue empty 30564 1726882822.27350: checking for any_errors_fatal 30564 1726882822.27358: done checking for any_errors_fatal 30564 1726882822.27358: checking for max_fail_percentage 30564 1726882822.27361: done checking for max_fail_percentage 30564 1726882822.27362: checking to see if all hosts have failed and the running result is not ok 30564 1726882822.27363: done checking to see if all hosts have failed 30564 1726882822.27365: getting the remaining hosts for this loop 30564 1726882822.27369: done getting the remaining hosts for this loop 30564 1726882822.27373: getting the next task for host managed_node2 30564 1726882822.27382: done getting next task for host managed_node2 30564 1726882822.27385: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 
30564 1726882822.27392: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882822.27403: getting variables 30564 1726882822.27405: in VariableManager get_vars() 30564 1726882822.27438: Calling all_inventory to load vars for managed_node2 30564 1726882822.27440: Calling groups_inventory to load vars for managed_node2 30564 1726882822.27443: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882822.27454: Calling all_plugins_play to load vars for managed_node2 30564 1726882822.27457: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882822.27459: Calling groups_plugins_play to load vars for managed_node2 30564 1726882822.29381: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000791 30564 1726882822.29388: WORKER PROCESS EXITING 30564 1726882822.29987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882822.33128: done with get_vars() 30564 1726882822.33158: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:40:22 -0400 (0:00:01.761) 0:00:20.923 ****** 30564 1726882822.34196: entering _queue_task() for managed_node2/package_facts 30564 1726882822.34499: worker is 1 (out of 1 available) 30564 1726882822.34512: exiting _queue_task() for managed_node2/package_facts 30564 1726882822.34524: done queuing things up, now waiting for results queue to drain 30564 1726882822.34525: waiting for pending results... 
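The task banner above carries a timer pair, `(0:00:01.761) 0:00:20.923`: the previous task's duration followed by the cumulative play time. A hedged sketch for pulling both numbers out of such a line (the regex is an assumption inferred from the logged format, not an Ansible API):

```python
import re

# Sketch: parse the "(H:MM:SS.mmm) H:MM:SS.mmm" pair appended to each
# task banner in the log above. The pattern is inferred from the logged
# format and is an assumption, not a documented interface.
PAT = re.compile(r"\((\d+):(\d+):(\d+\.\d+)\)\s+(\d+):(\d+):(\d+\.\d+)")

def parse_timing(line: str) -> tuple[float, float]:
    """Return (task_seconds, cumulative_seconds) for one banner line."""
    m = PAT.search(line)
    if m is None:
        raise ValueError("no timing pair found")
    h1, m1, s1, h2, m2, s2 = m.groups()
    task = int(h1) * 3600 + int(m1) * 60 + float(s1)
    total = int(h2) * 3600 + int(m2) * 60 + float(s2)
    return task, total

line = "Friday 20 September 2024 21:40:22 -0400 (0:00:01.761) 0:00:20.923"
print(parse_timing(line))  # (1.761, 20.923)
```

The wall-clock `21:40:22` is skipped because only the parenthesized duration and the trailing cumulative figure match the pattern.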
30564 1726882822.35235: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882822.36023: in run() - task 0e448fcc-3ce9-4216-acec-000000000792 30564 1726882822.36042: variable 'ansible_search_path' from source: unknown 30564 1726882822.36049: variable 'ansible_search_path' from source: unknown 30564 1726882822.36091: calling self._execute() 30564 1726882822.36192: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882822.36203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882822.36216: variable 'omit' from source: magic vars 30564 1726882822.36578: variable 'ansible_distribution_major_version' from source: facts 30564 1726882822.37288: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882822.37299: variable 'omit' from source: magic vars 30564 1726882822.37389: variable 'omit' from source: magic vars 30564 1726882822.37426: variable 'omit' from source: magic vars 30564 1726882822.37474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882822.37515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882822.37536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882822.37556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882822.37577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882822.37609: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882822.37616: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882822.37623: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882822.37729: Set connection var ansible_timeout to 10 30564 1726882822.37740: Set connection var ansible_pipelining to False 30564 1726882822.37746: Set connection var ansible_shell_type to sh 30564 1726882822.37754: Set connection var ansible_shell_executable to /bin/sh 30564 1726882822.37770: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882822.37781: Set connection var ansible_connection to ssh 30564 1726882822.37808: variable 'ansible_shell_executable' from source: unknown 30564 1726882822.37814: variable 'ansible_connection' from source: unknown 30564 1726882822.37821: variable 'ansible_module_compression' from source: unknown 30564 1726882822.37826: variable 'ansible_shell_type' from source: unknown 30564 1726882822.37832: variable 'ansible_shell_executable' from source: unknown 30564 1726882822.37837: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882822.37844: variable 'ansible_pipelining' from source: unknown 30564 1726882822.37849: variable 'ansible_timeout' from source: unknown 30564 1726882822.37856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882822.38052: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882822.38556: variable 'omit' from source: magic vars 30564 1726882822.38559: starting attempt loop 30564 1726882822.38562: running the handler 30564 1726882822.38579: _low_level_execute_command(): starting 30564 1726882822.38587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882822.40254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882822.40263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882822.40605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882822.40612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882822.40617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.40632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882822.40640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.40723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882822.40736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882822.40742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882822.40877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882822.42551: stdout chunk (state=3): >>>/root <<< 30564 1726882822.42686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882822.42776: stderr chunk (state=3): >>><<< 30564 1726882822.42779: stdout chunk (state=3): >>><<< 30564 1726882822.42896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882822.42901: _low_level_execute_command(): starting 30564 1726882822.42904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591 `" && echo ansible-tmp-1726882822.427989-31486-39072044539591="` echo /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591 `" ) && sleep 0' 30564 1726882822.44042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882822.44458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.44482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882822.44502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882822.44555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 30564 1726882822.44572: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882822.44587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.44605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882822.44616: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882822.44626: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882822.44636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.44648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882822.44662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882822.44680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882822.44691: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882822.44705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.44786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882822.44810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882822.44828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882822.44971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882822.46875: stdout chunk (state=3): >>>ansible-tmp-1726882822.427989-31486-39072044539591=/root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591 <<< 30564 1726882822.47091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882822.47095: stdout chunk (state=3): >>><<< 30564 1726882822.47098: 
stderr chunk (state=3): >>><<< 30564 1726882822.47375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882822.427989-31486-39072044539591=/root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882822.47379: variable 'ansible_module_compression' from source: unknown 30564 1726882822.47382: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882822.47384: variable 'ansible_facts' from source: unknown 30564 1726882822.47501: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591/AnsiballZ_package_facts.py 30564 1726882822.48145: Sending initial data 30564 1726882822.48148: Sent initial data (160 bytes) 30564 1726882822.50530: stderr chunk (state=3): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882822.50666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.50686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882822.50707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882822.50748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882822.50765: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882822.50782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.50800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882822.50820: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882822.50822: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882822.50833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.50846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882822.50861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882822.50880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882822.50891: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882822.50903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.50977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882822.51113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882822.51130: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882822.51260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882822.53095: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882822.53194: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882822.53300: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp9sexdlb3 /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591/AnsiballZ_package_facts.py <<< 30564 1726882822.53397: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882822.56525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882822.56614: stderr chunk (state=3): >>><<< 30564 1726882822.56617: stdout chunk (state=3): >>><<< 30564 1726882822.56640: done transferring module to remote 30564 1726882822.56651: _low_level_execute_command(): starting 30564 1726882822.56656: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591/ /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591/AnsiballZ_package_facts.py && sleep 0' 30564 1726882822.58180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 30564 1726882822.58497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882822.58682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882822.58841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882822.60660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882822.60667: stderr chunk (state=3): >>><<< 30564 1726882822.60672: stdout chunk (state=3): >>><<< 30564 1726882822.60689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882822.60692: _low_level_execute_command(): starting 30564 1726882822.60697: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591/AnsiballZ_package_facts.py && sleep 0' 30564 1726882822.61723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882822.61727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882822.62612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.62616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882822.62631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found <<< 30564 1726882822.62637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882822.62707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882822.62719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882822.62727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882822.62859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882823.08789: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": 
[{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": 
"centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 30564 1726882823.08833: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": 
"9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", 
"release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", 
"version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": 
"92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", 
"release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": 
"libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": 
[{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": 
"2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 30564 1726882823.08971: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 30564 1726882823.08974: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 30564 1726882823.09000: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": 
"5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 30564 1726882823.09020: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": 
"nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 30564 1726882823.09025: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], 
"dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882823.10594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882823.10602: stdout chunk (state=3): >>><<< 30564 1726882823.10614: stderr chunk (state=3): >>><<< 30564 1726882823.10658: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882823.12740: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882823.12755: _low_level_execute_command(): starting 30564 1726882823.12760: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882822.427989-31486-39072044539591/ > /dev/null 2>&1 && sleep 0' 30564 1726882823.13191: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882823.13194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882823.13226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882823.13229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882823.13231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882823.13279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882823.13291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882823.13398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882823.15241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882823.15322: stderr chunk (state=3): >>><<< 30564 1726882823.15325: stdout chunk (state=3): >>><<< 30564 1726882823.15375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882823.15379: handler run complete 
30564 1726882823.16253: variable 'ansible_facts' from source: unknown 30564 1726882823.16596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.17840: variable 'ansible_facts' from source: unknown 30564 1726882823.18136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.18900: attempt loop complete, returning result 30564 1726882823.18912: _execute() done 30564 1726882823.18915: dumping result to json 30564 1726882823.19132: done dumping result, returning 30564 1726882823.19141: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000000792] 30564 1726882823.19143: sending task result for task 0e448fcc-3ce9-4216-acec-000000000792 30564 1726882823.20634: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000792 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882823.20715: WORKER PROCESS EXITING 30564 1726882823.20723: no more pending results, returning what we have 30564 1726882823.20726: results queue empty 30564 1726882823.20726: checking for any_errors_fatal 30564 1726882823.20730: done checking for any_errors_fatal 30564 1726882823.20731: checking for max_fail_percentage 30564 1726882823.20732: done checking for max_fail_percentage 30564 1726882823.20732: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.20733: done checking to see if all hosts have failed 30564 1726882823.20733: getting the remaining hosts for this loop 30564 1726882823.20734: done getting the remaining hosts for this loop 30564 1726882823.20737: getting the next task for host managed_node2 30564 1726882823.20744: done getting next task for host managed_node2 30564 1726882823.20747: ^ 
task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882823.20751: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882823.20758: getting variables 30564 1726882823.20759: in VariableManager get_vars() 30564 1726882823.20783: Calling all_inventory to load vars for managed_node2 30564 1726882823.20785: Calling groups_inventory to load vars for managed_node2 30564 1726882823.20787: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.20793: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.20795: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.20796: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.21515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.22460: done with get_vars() 30564 1726882823.22479: done getting variables 30564 1726882823.22522: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:40:23 -0400 (0:00:00.883) 0:00:21.806 ****** 30564 1726882823.22549: entering _queue_task() for managed_node2/debug 30564 1726882823.22761: worker is 1 (out of 1 available) 30564 1726882823.22777: exiting _queue_task() for managed_node2/debug 30564 1726882823.22789: done queuing things up, now waiting for results queue to drain 30564 1726882823.22791: waiting for pending results... 
30564 1726882823.22969: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882823.23060: in run() - task 0e448fcc-3ce9-4216-acec-000000000730 30564 1726882823.23077: variable 'ansible_search_path' from source: unknown 30564 1726882823.23081: variable 'ansible_search_path' from source: unknown 30564 1726882823.23108: calling self._execute() 30564 1726882823.23185: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.23190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.23200: variable 'omit' from source: magic vars 30564 1726882823.23470: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.23485: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882823.23490: variable 'omit' from source: magic vars 30564 1726882823.23529: variable 'omit' from source: magic vars 30564 1726882823.23599: variable 'network_provider' from source: set_fact 30564 1726882823.23611: variable 'omit' from source: magic vars 30564 1726882823.23643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882823.23683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882823.23696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882823.23709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882823.23718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882823.23741: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882823.23744: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882823.23747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.23819: Set connection var ansible_timeout to 10 30564 1726882823.23822: Set connection var ansible_pipelining to False 30564 1726882823.23825: Set connection var ansible_shell_type to sh 30564 1726882823.23830: Set connection var ansible_shell_executable to /bin/sh 30564 1726882823.23837: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882823.23839: Set connection var ansible_connection to ssh 30564 1726882823.23856: variable 'ansible_shell_executable' from source: unknown 30564 1726882823.23860: variable 'ansible_connection' from source: unknown 30564 1726882823.23862: variable 'ansible_module_compression' from source: unknown 30564 1726882823.23866: variable 'ansible_shell_type' from source: unknown 30564 1726882823.23868: variable 'ansible_shell_executable' from source: unknown 30564 1726882823.23873: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.23879: variable 'ansible_pipelining' from source: unknown 30564 1726882823.23884: variable 'ansible_timeout' from source: unknown 30564 1726882823.23890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.23992: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882823.24002: variable 'omit' from source: magic vars 30564 1726882823.24005: starting attempt loop 30564 1726882823.24008: running the handler 30564 1726882823.24044: handler run complete 30564 1726882823.24054: attempt loop complete, returning result 30564 1726882823.24057: _execute() done 30564 1726882823.24060: dumping result to json 30564 1726882823.24063: done dumping result, returning 
30564 1726882823.24073: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000000730] 30564 1726882823.24078: sending task result for task 0e448fcc-3ce9-4216-acec-000000000730 30564 1726882823.24162: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000730 30564 1726882823.24169: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882823.24225: no more pending results, returning what we have 30564 1726882823.24235: results queue empty 30564 1726882823.24236: checking for any_errors_fatal 30564 1726882823.24243: done checking for any_errors_fatal 30564 1726882823.24244: checking for max_fail_percentage 30564 1726882823.24246: done checking for max_fail_percentage 30564 1726882823.24246: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.24247: done checking to see if all hosts have failed 30564 1726882823.24248: getting the remaining hosts for this loop 30564 1726882823.24249: done getting the remaining hosts for this loop 30564 1726882823.24253: getting the next task for host managed_node2 30564 1726882823.24259: done getting next task for host managed_node2 30564 1726882823.24263: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882823.24271: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882823.24282: getting variables 30564 1726882823.24283: in VariableManager get_vars() 30564 1726882823.24311: Calling all_inventory to load vars for managed_node2 30564 1726882823.24313: Calling groups_inventory to load vars for managed_node2 30564 1726882823.24315: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.24323: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.24325: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.24327: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.25194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.26146: done with get_vars() 30564 1726882823.26161: done getting variables 30564 1726882823.26205: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:40:23 -0400 (0:00:00.036) 0:00:21.843 ****** 30564 1726882823.26232: entering _queue_task() for managed_node2/fail 30564 1726882823.26421: worker is 1 (out of 1 available) 30564 1726882823.26434: exiting _queue_task() for managed_node2/fail 30564 1726882823.26445: done queuing things up, now waiting for results queue to drain 30564 1726882823.26446: waiting for pending results... 30564 1726882823.26619: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882823.26712: in run() - task 0e448fcc-3ce9-4216-acec-000000000731 30564 1726882823.26724: variable 'ansible_search_path' from source: unknown 30564 1726882823.26728: variable 'ansible_search_path' from source: unknown 30564 1726882823.26756: calling self._execute() 30564 1726882823.26825: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.26829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.26837: variable 'omit' from source: magic vars 30564 1726882823.27097: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.27108: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882823.27192: variable 'network_state' from source: role '' defaults 30564 1726882823.27200: Evaluated conditional (network_state != {}): False 30564 1726882823.27203: when evaluation is False, skipping this task 30564 1726882823.27206: _execute() done 30564 1726882823.27210: dumping result to json 30564 1726882823.27213: done dumping result, returning 30564 1726882823.27219: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-000000000731] 30564 1726882823.27224: sending task result for task 0e448fcc-3ce9-4216-acec-000000000731 30564 1726882823.27313: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000731 30564 1726882823.27316: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882823.27363: no more pending results, returning what we have 30564 1726882823.27370: results queue empty 30564 1726882823.27371: checking for any_errors_fatal 30564 1726882823.27376: done checking for any_errors_fatal 30564 1726882823.27377: checking for max_fail_percentage 30564 1726882823.27378: done checking for max_fail_percentage 30564 1726882823.27379: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.27380: done checking to see if all hosts have failed 30564 1726882823.27381: getting the remaining hosts for this loop 30564 1726882823.27382: done getting the remaining hosts for this loop 30564 1726882823.27385: getting the next task for host managed_node2 30564 1726882823.27391: done getting next task for host managed_node2 30564 1726882823.27395: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882823.27399: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882823.27414: getting variables 30564 1726882823.27416: in VariableManager get_vars() 30564 1726882823.27448: Calling all_inventory to load vars for managed_node2 30564 1726882823.27450: Calling groups_inventory to load vars for managed_node2 30564 1726882823.27452: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.27458: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.27459: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.27461: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.28214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.29158: done with get_vars() 30564 1726882823.29176: done getting variables 30564 1726882823.29215: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:40:23 -0400 (0:00:00.030) 0:00:21.873 ****** 30564 1726882823.29236: entering _queue_task() for managed_node2/fail 30564 1726882823.29409: worker is 1 (out of 1 available) 30564 1726882823.29423: exiting _queue_task() for managed_node2/fail 30564 1726882823.29434: done queuing things up, now waiting for results queue to drain 30564 1726882823.29435: waiting for pending results... 30564 1726882823.29618: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882823.29712: in run() - task 0e448fcc-3ce9-4216-acec-000000000732 30564 1726882823.29721: variable 'ansible_search_path' from source: unknown 30564 1726882823.29724: variable 'ansible_search_path' from source: unknown 30564 1726882823.29751: calling self._execute() 30564 1726882823.29826: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.29831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.29840: variable 'omit' from source: magic vars 30564 1726882823.30110: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.30121: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882823.30206: variable 'network_state' from source: role '' defaults 30564 1726882823.30217: Evaluated conditional (network_state != {}): False 30564 1726882823.30220: when evaluation is False, skipping this task 30564 1726882823.30224: _execute() done 30564 1726882823.30226: dumping result to json 30564 1726882823.30229: done dumping result, returning 30564 1726882823.30233: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-000000000732] 30564 1726882823.30240: sending task result for task 0e448fcc-3ce9-4216-acec-000000000732 30564 1726882823.30335: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000732 30564 1726882823.30338: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882823.30391: no more pending results, returning what we have 30564 1726882823.30393: results queue empty 30564 1726882823.30394: checking for any_errors_fatal 30564 1726882823.30399: done checking for any_errors_fatal 30564 1726882823.30399: checking for max_fail_percentage 30564 1726882823.30401: done checking for max_fail_percentage 30564 1726882823.30402: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.30402: done checking to see if all hosts have failed 30564 1726882823.30403: getting the remaining hosts for this loop 30564 1726882823.30404: done getting the remaining hosts for this loop 30564 1726882823.30407: getting the next task for host managed_node2 30564 1726882823.30413: done getting next task for host managed_node2 30564 1726882823.30416: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882823.30421: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882823.30435: getting variables 30564 1726882823.30437: in VariableManager get_vars() 30564 1726882823.30468: Calling all_inventory to load vars for managed_node2 30564 1726882823.30470: Calling groups_inventory to load vars for managed_node2 30564 1726882823.30472: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.30478: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.30479: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.30481: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.31318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.32256: done with get_vars() 30564 1726882823.32272: done getting variables 30564 1726882823.32314: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:40:23 -0400 (0:00:00.030) 0:00:21.904 ****** 30564 1726882823.32335: entering _queue_task() for managed_node2/fail 30564 1726882823.32516: worker is 1 (out of 1 available) 30564 1726882823.32529: exiting _queue_task() for managed_node2/fail 30564 1726882823.32540: done queuing things up, now waiting for results queue to drain 30564 1726882823.32541: waiting for pending results... 30564 1726882823.32726: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882823.32809: in run() - task 0e448fcc-3ce9-4216-acec-000000000733 30564 1726882823.32819: variable 'ansible_search_path' from source: unknown 30564 1726882823.32823: variable 'ansible_search_path' from source: unknown 30564 1726882823.32851: calling self._execute() 30564 1726882823.32924: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.32930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.32938: variable 'omit' from source: magic vars 30564 1726882823.33213: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.33224: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882823.33343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882823.34947: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882823.34994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882823.35020: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882823.35045: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882823.35071: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882823.35124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882823.35158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882823.35182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882823.35209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882823.35220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882823.35289: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.35301: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882823.35304: when evaluation is False, skipping this task 30564 1726882823.35307: _execute() done 30564 1726882823.35309: dumping result to json 30564 1726882823.35312: done dumping result, returning 30564 1726882823.35319: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming 
configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-000000000733] 30564 1726882823.35324: sending task result for task 0e448fcc-3ce9-4216-acec-000000000733 30564 1726882823.35413: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000733 30564 1726882823.35416: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882823.35456: no more pending results, returning what we have 30564 1726882823.35459: results queue empty 30564 1726882823.35460: checking for any_errors_fatal 30564 1726882823.35469: done checking for any_errors_fatal 30564 1726882823.35470: checking for max_fail_percentage 30564 1726882823.35472: done checking for max_fail_percentage 30564 1726882823.35473: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.35473: done checking to see if all hosts have failed 30564 1726882823.35474: getting the remaining hosts for this loop 30564 1726882823.35477: done getting the remaining hosts for this loop 30564 1726882823.35480: getting the next task for host managed_node2 30564 1726882823.35494: done getting next task for host managed_node2 30564 1726882823.35498: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882823.35503: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882823.35517: getting variables 30564 1726882823.35519: in VariableManager get_vars() 30564 1726882823.35608: Calling all_inventory to load vars for managed_node2 30564 1726882823.35611: Calling groups_inventory to load vars for managed_node2 30564 1726882823.35613: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.35622: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.35624: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.35627: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.37082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.38029: done with get_vars() 30564 1726882823.38044: done getting variables 30564 1726882823.38087: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:40:23 -0400 (0:00:00.057) 0:00:21.962 ******
30564 1726882823.38110: entering _queue_task() for managed_node2/dnf
30564 1726882823.38302: worker is 1 (out of 1 available)
30564 1726882823.38316: exiting _queue_task() for managed_node2/dnf
30564 1726882823.38326: done queuing things up, now waiting for results queue to drain
30564 1726882823.38328: waiting for pending results...
30564 1726882823.38511: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882823.38599: in run() - task 0e448fcc-3ce9-4216-acec-000000000734
30564 1726882823.38610: variable 'ansible_search_path' from source: unknown
30564 1726882823.38613: variable 'ansible_search_path' from source: unknown
30564 1726882823.38641: calling self._execute()
30564 1726882823.38731: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882823.38742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882823.38754: variable 'omit' from source: magic vars
30564 1726882823.39233: variable 'ansible_distribution_major_version' from source: facts
30564 1726882823.39252: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882823.39447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882823.41693: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882823.41758: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882823.41800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882823.41838: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882823.41871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882823.41959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.41995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.42025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.42076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.42096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.42211: variable 'ansible_distribution' from source: facts
30564 1726882823.42221: variable 'ansible_distribution_major_version' from source: facts
30564 1726882823.42239: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30564 1726882823.42352: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882823.42485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.42512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.42541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.42587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.42605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.42646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.42675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.42705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.42748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.42768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.42809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.42836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.42870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.42913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.42930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.43086: variable 'network_connections' from source: include params
30564 1726882823.43102: variable 'interface' from source: play vars
30564 1726882823.43167: variable 'interface' from source: play vars
30564 1726882823.43238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882823.43380: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882823.43417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882823.43451: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882823.43491: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882823.43538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882823.43568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882823.43607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.43638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882823.43705: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882823.44023: variable 'network_connections' from source: include params
30564 1726882823.44034: variable 'interface' from source: play vars
30564 1726882823.44113: variable 'interface' from source: play vars
30564 1726882823.44156: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882823.44166: when evaluation is False, skipping this task
30564 1726882823.44174: _execute() done
30564 1726882823.44181: dumping result to json
30564 1726882823.44192: done dumping result, returning
30564 1726882823.44203: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000734]
30564
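[Editor's note] The `Evaluated conditional (...): False` entry above is what drives the skip that follows: the expression is the task's `when` guard, and it is echoed back verbatim as `false_condition` in the result JSON. A minimal sketch of a task guarded this way is shown below; the module (`dnf`) and the `when` expression come from the log, while the `name:` argument and `check_mode:` flag are illustrative assumptions, not the role's verbatim source at main.yml:36.

```yaml
# Illustrative sketch only -- NOT the verbatim task from
# roles/network/tasks/main.yml:36. Module and `when` guard are taken
# from the log; `name:` and `check_mode:` are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed; the variable appears later in the log
    state: latest
  check_mode: true                  # assumed: probe for updates without changing anything
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

With neither wireless nor team connections defined, the guard evaluates to False and Ansible records exactly the `skipping:` result that follows in the log.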
1726882823.44213: sending task result for task 0e448fcc-3ce9-4216-acec-000000000734
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882823.44370: no more pending results, returning what we have
30564 1726882823.44374: results queue empty
30564 1726882823.44375: checking for any_errors_fatal
30564 1726882823.44382: done checking for any_errors_fatal
30564 1726882823.44383: checking for max_fail_percentage
30564 1726882823.44385: done checking for max_fail_percentage
30564 1726882823.44386: checking to see if all hosts have failed and the running result is not ok
30564 1726882823.44387: done checking to see if all hosts have failed
30564 1726882823.44388: getting the remaining hosts for this loop
30564 1726882823.44389: done getting the remaining hosts for this loop
30564 1726882823.44394: getting the next task for host managed_node2
30564 1726882823.44404: done getting next task for host managed_node2
30564 1726882823.44408: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882823.44413: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882823.44432: getting variables
30564 1726882823.44434: in VariableManager get_vars()
30564 1726882823.44473: Calling all_inventory to load vars for managed_node2
30564 1726882823.44477: Calling groups_inventory to load vars for managed_node2
30564 1726882823.44480: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882823.44490: Calling all_plugins_play to load vars for managed_node2
30564 1726882823.44494: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882823.44497: Calling groups_plugins_play to load vars for managed_node2
30564 1726882823.45982: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000734
30564 1726882823.45986: WORKER PROCESS EXITING
30564 1726882823.46358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882823.48104: done with get_vars()
30564 1726882823.48126: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30564 1726882823.48200: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:40:23 -0400 (0:00:00.101) 0:00:22.063 ******
30564 1726882823.48231: entering _queue_task() for managed_node2/yum
30564 1726882823.48498: worker is 1 (out of 1 available)
30564 1726882823.48510: exiting _queue_task() for managed_node2/yum
30564 1726882823.48522: done queuing things up, now waiting for results queue to drain
30564 1726882823.48524: waiting for pending results...
30564 1726882823.48808: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882823.48935: in run() - task 0e448fcc-3ce9-4216-acec-000000000735
30564 1726882823.48956: variable 'ansible_search_path' from source: unknown
30564 1726882823.48969: variable 'ansible_search_path' from source: unknown
30564 1726882823.49011: calling self._execute()
30564 1726882823.49105: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882823.49115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882823.49126: variable 'omit' from source: magic vars
30564 1726882823.49507: variable 'ansible_distribution_major_version' from source: facts
30564 1726882823.49528: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882823.49754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882823.52590: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882823.52656: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882823.52700: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882823.52737: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882823.52767: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882823.52847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.52883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.52915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.52959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.52981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.53079: variable 'ansible_distribution_major_version' from source: facts
30564 1726882823.53098: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30564 1726882823.53109: when evaluation is False, skipping this task
30564 1726882823.53115: _execute() done
30564 1726882823.53122: dumping result to json
30564 1726882823.53128: done dumping result, returning
30564 1726882823.53139: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000735]
30564 1726882823.53148: sending task result for task 0e448fcc-3ce9-4216-acec-000000000735
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30564 1726882823.53297: no more pending results, returning what we have
30564 1726882823.53301: results queue empty
30564 1726882823.53302: checking for any_errors_fatal
30564 1726882823.53307: done checking for any_errors_fatal
30564 1726882823.53308: checking for max_fail_percentage
30564 1726882823.53310: done checking for max_fail_percentage
30564 1726882823.53311: checking to see if all hosts have failed and the running result is not ok
30564 1726882823.53312: done checking to see if all hosts have failed
30564 1726882823.53313: getting the remaining hosts for this loop
30564 1726882823.53314: done getting the remaining hosts for this loop
30564 1726882823.53318: getting the next task for host managed_node2
30564 1726882823.53327: done getting next task for host managed_node2
30564 1726882823.53332: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882823.53337: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882823.53354: getting variables
30564 1726882823.53356: in VariableManager get_vars()
30564 1726882823.53392: Calling all_inventory to load vars for managed_node2
30564 1726882823.53394: Calling groups_inventory to load vars for managed_node2
30564 1726882823.53397: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882823.53407: Calling all_plugins_play to load vars for managed_node2
30564 1726882823.53410: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882823.53413: Calling groups_plugins_play to load vars for managed_node2
30564 1726882823.54387: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000735
30564 1726882823.54391: WORKER PROCESS EXITING
30564 1726882823.55194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882823.57060: done with get_vars()
30564 1726882823.57087: done getting variables
30564 1726882823.57144: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:40:23 -0400 (0:00:00.089) 0:00:22.153 ******
30564 1726882823.57181: entering _queue_task() for managed_node2/fail
30564 1726882823.57462: worker is 1 (out of 1 available)
30564 1726882823.57477: exiting _queue_task() for managed_node2/fail
30564 1726882823.57490: done queuing things up, now waiting for results queue to drain
30564 1726882823.57491: waiting for pending results...
30564 1726882823.57776: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882823.57920: in run() - task 0e448fcc-3ce9-4216-acec-000000000736
30564 1726882823.57942: variable 'ansible_search_path' from source: unknown
30564 1726882823.57949: variable 'ansible_search_path' from source: unknown
30564 1726882823.57988: calling self._execute()
30564 1726882823.58087: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882823.58098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882823.58113: variable 'omit' from source: magic vars
30564 1726882823.58478: variable 'ansible_distribution_major_version' from source: facts
30564 1726882823.58499: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882823.58622: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882823.58822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882823.61150: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882823.61226: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882823.61269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882823.61312: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882823.61347: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882823.61444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.61494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.61556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.61687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.61708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.61759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.61794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.61825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.61873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.61897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.61941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.61971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.62005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.62048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.62067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.62252: variable 'network_connections' from source: include params
30564 1726882823.62270: variable 'interface' from source: play vars
30564 1726882823.62338: variable 'interface' from source: play vars
30564 1726882823.62410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882823.62574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882823.62615: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882823.62653: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882823.62690: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882823.62735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882823.62768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882823.62798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.62827: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882823.62894: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882823.63143: variable 'network_connections' from source: include params
30564 1726882823.63153: variable 'interface' from source: play vars
30564 1726882823.63221: variable 'interface' from source: play vars
30564 1726882823.63254: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882823.63262: when evaluation is False, skipping this task
30564 1726882823.63272: _execute() done
30564 1726882823.63279: dumping result to json
30564 1726882823.63289: done dumping result, returning
30564 1726882823.63302: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000736]
30564 1726882823.63312: sending task result for task 0e448fcc-3ce9-4216-acec-000000000736
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882823.63466: no more pending results, returning what we have
30564 1726882823.63470: results queue empty
30564 1726882823.63472: checking for any_errors_fatal
30564 1726882823.63479: done checking for any_errors_fatal
30564 1726882823.63480: checking for max_fail_percentage
30564 1726882823.63483: done checking for max_fail_percentage
30564 1726882823.63483: checking to see if all hosts have failed and the running result is not ok
30564 1726882823.63484: done checking to see if all hosts have failed
30564 1726882823.63485: getting the remaining hosts for this loop
30564 1726882823.63487: done getting the remaining hosts for this loop
30564 1726882823.63491: getting the next task for host managed_node2
30564 1726882823.63500: done getting next task for host managed_node2
30564 1726882823.63504: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
30564 1726882823.63509: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882823.63526: getting variables
30564 1726882823.63528: in VariableManager get_vars()
30564 1726882823.63563: Calling all_inventory to load vars for managed_node2
30564 1726882823.63567: Calling groups_inventory to load vars for managed_node2
30564 1726882823.63570: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882823.63580: Calling all_plugins_play to load vars for managed_node2
30564 1726882823.63583: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882823.63586: Calling groups_plugins_play to load vars for managed_node2
30564 1726882823.64582: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000736
30564 1726882823.64585: WORKER PROCESS EXITING
30564 1726882823.65392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882823.67281: done with get_vars()
30564 1726882823.67307: done getting variables
30564 1726882823.67367: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:40:23 -0400 (0:00:00.102) 0:00:22.255 ******
30564 1726882823.67401: entering _queue_task() for managed_node2/package
30564 1726882823.67689: worker is 1 (out of 1 available)
30564 1726882823.67702: exiting _queue_task() for managed_node2/package
30564 1726882823.67716: done queuing things up, now waiting for results queue to drain
30564 1726882823.67717: waiting for pending results...
30564 1726882823.68017: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages
30564 1726882823.68146: in run() - task 0e448fcc-3ce9-4216-acec-000000000737
30564 1726882823.68172: variable 'ansible_search_path' from source: unknown
30564 1726882823.68179: variable 'ansible_search_path' from source: unknown
30564 1726882823.68216: calling self._execute()
30564 1726882823.68313: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882823.68323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882823.68337: variable 'omit' from source: magic vars
30564 1726882823.68708: variable 'ansible_distribution_major_version' from source: facts
30564 1726882823.68727: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882823.68932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882823.69201: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564
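[Editor's note] The `Loading ActionModule 'package'` entry marks the start of the role's "Install packages" task at main.yml:73. `ansible.builtin.package` is a generic front end that dispatches to the platform's package manager (on this Fedora-family host, DNF, as the earlier `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line illustrates). A hedged sketch of such a task follows; only the task name, the `package` module, and the `!= '6'` guard appear in the log, while the `name:` and `state:` arguments are illustrative assumptions.

```yaml
# Illustrative sketch only -- NOT the verbatim task from
# roles/network/tasks/main.yml:73. Task name, module, and `when` guard
# come from the log; `name:` and `state:` are assumptions.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"  # role default, referenced in the trace below
    state: present
  when: ansible_distribution_major_version != '6'
```

Because this guard evaluated True, the trace below continues into variable resolution (network_packages and the __network_* provider defaults) instead of recording a skip.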
1726882823.69252: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882823.69291: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882823.69346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882823.69454: variable 'network_packages' from source: role '' defaults
30564 1726882823.69567: variable '__network_provider_setup' from source: role '' defaults
30564 1726882823.69584: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882823.69649: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882823.69661: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882823.69731: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882823.69925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882823.72053: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882823.72139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882823.72188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882823.72226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882823.72257: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882823.72345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.72382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.72419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.72487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.72511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.72567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.72596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.72641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.72689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.72725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.73043: variable '__network_packages_default_gobject_packages' from source: role '' defaults
30564 1726882823.73193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.73222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882823.73279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882823.73324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882823.73344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882823.73546: variable 'ansible_python' from source: facts
30564 1726882823.73570: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
30564 1726882823.73661: variable '__network_wpa_supplicant_required' from source: role '' defaults
30564 1726882823.73749: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30564 1726882823.73884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882823.73912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882823.73941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882823.73990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882823.74008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882823.74055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882823.74097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882823.74126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882823.74172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882823.74195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882823.74328: variable 'network_connections' from source: include params 
30564 1726882823.74338: variable 'interface' from source: play vars 30564 1726882823.74442: variable 'interface' from source: play vars 30564 1726882823.74532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882823.74562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882823.74598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882823.74639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882823.74693: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882823.74988: variable 'network_connections' from source: include params 30564 1726882823.74998: variable 'interface' from source: play vars 30564 1726882823.75105: variable 'interface' from source: play vars 30564 1726882823.75163: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882823.75247: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882823.75572: variable 'network_connections' from source: include params 30564 1726882823.75582: variable 'interface' from source: play vars 30564 1726882823.75655: variable 'interface' from source: play vars 30564 1726882823.75687: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882823.75776: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882823.76105: variable 'network_connections' 
from source: include params 30564 1726882823.76115: variable 'interface' from source: play vars 30564 1726882823.76189: variable 'interface' from source: play vars 30564 1726882823.76262: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882823.76330: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882823.76342: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882823.76405: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882823.76606: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882823.77054: variable 'network_connections' from source: include params 30564 1726882823.77066: variable 'interface' from source: play vars 30564 1726882823.77133: variable 'interface' from source: play vars 30564 1726882823.77149: variable 'ansible_distribution' from source: facts 30564 1726882823.77159: variable '__network_rh_distros' from source: role '' defaults 30564 1726882823.77173: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.77206: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882823.77369: variable 'ansible_distribution' from source: facts 30564 1726882823.77379: variable '__network_rh_distros' from source: role '' defaults 30564 1726882823.77388: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.77400: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882823.77554: variable 'ansible_distribution' from source: facts 30564 1726882823.77565: variable '__network_rh_distros' from source: role '' defaults 30564 1726882823.77574: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.77610: variable 'network_provider' from source: set_fact 30564 
1726882823.77629: variable 'ansible_facts' from source: unknown 30564 1726882823.78421: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882823.78431: when evaluation is False, skipping this task 30564 1726882823.78438: _execute() done 30564 1726882823.78444: dumping result to json 30564 1726882823.78452: done dumping result, returning 30564 1726882823.78467: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000000737] 30564 1726882823.78479: sending task result for task 0e448fcc-3ce9-4216-acec-000000000737 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882823.78642: no more pending results, returning what we have 30564 1726882823.78646: results queue empty 30564 1726882823.78647: checking for any_errors_fatal 30564 1726882823.78653: done checking for any_errors_fatal 30564 1726882823.78654: checking for max_fail_percentage 30564 1726882823.78656: done checking for max_fail_percentage 30564 1726882823.78657: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.78658: done checking to see if all hosts have failed 30564 1726882823.78659: getting the remaining hosts for this loop 30564 1726882823.78661: done getting the remaining hosts for this loop 30564 1726882823.78671: getting the next task for host managed_node2 30564 1726882823.78680: done getting next task for host managed_node2 30564 1726882823.78685: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882823.78690: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882823.78709: getting variables 30564 1726882823.78711: in VariableManager get_vars() 30564 1726882823.78748: Calling all_inventory to load vars for managed_node2 30564 1726882823.78751: Calling groups_inventory to load vars for managed_node2 30564 1726882823.78759: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.78772: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.78775: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.78778: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.79884: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000737 30564 1726882823.79887: WORKER PROCESS EXITING 30564 1726882823.80569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.82276: done with get_vars() 30564 1726882823.82311: done getting variables 30564 1726882823.82376: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:40:23 -0400 (0:00:00.150) 0:00:22.405 ****** 30564 1726882823.82421: entering _queue_task() for managed_node2/package 30564 1726882823.82754: worker is 1 (out of 1 available) 30564 1726882823.82768: exiting _queue_task() for managed_node2/package 30564 1726882823.82780: done queuing things up, now waiting for results queue to drain 30564 1726882823.82782: waiting for pending results... 30564 1726882823.83201: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882823.83351: in run() - task 0e448fcc-3ce9-4216-acec-000000000738 30564 1726882823.83375: variable 'ansible_search_path' from source: unknown 30564 1726882823.83383: variable 'ansible_search_path' from source: unknown 30564 1726882823.83420: calling self._execute() 30564 1726882823.83523: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.83534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.83553: variable 'omit' from source: magic vars 30564 1726882823.83915: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.83934: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882823.84050: variable 'network_state' from source: role '' defaults 30564 1726882823.84067: Evaluated conditional (network_state != {}): False 30564 1726882823.84075: when evaluation 
is False, skipping this task 30564 1726882823.84085: _execute() done 30564 1726882823.84095: dumping result to json 30564 1726882823.84101: done dumping result, returning 30564 1726882823.84112: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000738] 30564 1726882823.84123: sending task result for task 0e448fcc-3ce9-4216-acec-000000000738 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882823.84278: no more pending results, returning what we have 30564 1726882823.84283: results queue empty 30564 1726882823.84284: checking for any_errors_fatal 30564 1726882823.84292: done checking for any_errors_fatal 30564 1726882823.84292: checking for max_fail_percentage 30564 1726882823.84294: done checking for max_fail_percentage 30564 1726882823.84295: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.84296: done checking to see if all hosts have failed 30564 1726882823.84297: getting the remaining hosts for this loop 30564 1726882823.84299: done getting the remaining hosts for this loop 30564 1726882823.84303: getting the next task for host managed_node2 30564 1726882823.84311: done getting next task for host managed_node2 30564 1726882823.84315: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882823.84320: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882823.84343: getting variables 30564 1726882823.84345: in VariableManager get_vars() 30564 1726882823.84382: Calling all_inventory to load vars for managed_node2 30564 1726882823.84385: Calling groups_inventory to load vars for managed_node2 30564 1726882823.84388: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.84400: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.84403: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.84406: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.85384: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000738 30564 1726882823.85387: WORKER PROCESS EXITING 30564 1726882823.86205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.87904: done with get_vars() 30564 1726882823.87929: done getting variables 30564 1726882823.87990: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:40:23 -0400 (0:00:00.056) 0:00:22.461 ****** 30564 1726882823.88024: entering _queue_task() for managed_node2/package 30564 1726882823.88302: worker is 1 (out of 1 available) 30564 1726882823.88314: exiting _queue_task() for managed_node2/package 30564 1726882823.88325: done queuing things up, now waiting for results queue to drain 30564 1726882823.88326: waiting for pending results... 30564 1726882823.88605: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882823.88752: in run() - task 0e448fcc-3ce9-4216-acec-000000000739 30564 1726882823.88779: variable 'ansible_search_path' from source: unknown 30564 1726882823.88787: variable 'ansible_search_path' from source: unknown 30564 1726882823.88825: calling self._execute() 30564 1726882823.88919: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.88929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.88943: variable 'omit' from source: magic vars 30564 1726882823.89305: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.89328: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882823.89452: variable 'network_state' from source: role '' defaults 30564 1726882823.89468: Evaluated conditional (network_state != {}): False 30564 1726882823.89476: when evaluation is False, skipping this task 30564 1726882823.89484: _execute() done 30564 1726882823.89490: dumping 
result to json 30564 1726882823.89497: done dumping result, returning 30564 1726882823.89508: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000739] 30564 1726882823.89519: sending task result for task 0e448fcc-3ce9-4216-acec-000000000739 30564 1726882823.89633: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000739 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882823.89682: no more pending results, returning what we have 30564 1726882823.89686: results queue empty 30564 1726882823.89687: checking for any_errors_fatal 30564 1726882823.89697: done checking for any_errors_fatal 30564 1726882823.89698: checking for max_fail_percentage 30564 1726882823.89700: done checking for max_fail_percentage 30564 1726882823.89701: checking to see if all hosts have failed and the running result is not ok 30564 1726882823.89701: done checking to see if all hosts have failed 30564 1726882823.89702: getting the remaining hosts for this loop 30564 1726882823.89704: done getting the remaining hosts for this loop 30564 1726882823.89708: getting the next task for host managed_node2 30564 1726882823.89717: done getting next task for host managed_node2 30564 1726882823.89721: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882823.89727: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882823.89748: getting variables 30564 1726882823.89750: in VariableManager get_vars() 30564 1726882823.89786: Calling all_inventory to load vars for managed_node2 30564 1726882823.89789: Calling groups_inventory to load vars for managed_node2 30564 1726882823.89792: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882823.89804: Calling all_plugins_play to load vars for managed_node2 30564 1726882823.89807: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882823.89810: Calling groups_plugins_play to load vars for managed_node2 30564 1726882823.90782: WORKER PROCESS EXITING 30564 1726882823.91539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882823.97527: done with get_vars() 30564 1726882823.97553: done getting variables 30564 1726882823.97601: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:40:23 -0400 (0:00:00.096) 0:00:22.557 ****** 30564 1726882823.97630: entering _queue_task() for managed_node2/service 30564 1726882823.97954: worker is 1 (out of 1 available) 30564 1726882823.97968: exiting _queue_task() for managed_node2/service 30564 1726882823.97979: done queuing things up, now waiting for results queue to drain 30564 1726882823.97981: waiting for pending results... 30564 1726882823.98266: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882823.98415: in run() - task 0e448fcc-3ce9-4216-acec-00000000073a 30564 1726882823.98436: variable 'ansible_search_path' from source: unknown 30564 1726882823.98445: variable 'ansible_search_path' from source: unknown 30564 1726882823.98486: calling self._execute() 30564 1726882823.98587: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882823.98600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882823.98615: variable 'omit' from source: magic vars 30564 1726882823.98987: variable 'ansible_distribution_major_version' from source: facts 30564 1726882823.99006: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882823.99134: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882823.99340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882824.01846: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882824.01934: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882824.02016: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882824.02055: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882824.02090: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882824.02180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.02217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.02252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.02303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.02323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.02382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.02411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.02442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.02489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.02508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.02552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.02584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.02613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.02655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.02681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.02896: variable 'network_connections' from source: include params 
30564 1726882824.02919: variable 'interface' from source: play vars 30564 1726882824.03058: variable 'interface' from source: play vars 30564 1726882824.03160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882824.03304: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882824.03343: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882824.03370: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882824.03392: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882824.03424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882824.03442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882824.03468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.03490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882824.03540: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882824.03699: variable 'network_connections' from source: include params 30564 1726882824.03704: variable 'interface' from source: play vars 30564 1726882824.03751: variable 'interface' from source: play vars 30564 1726882824.03773: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882824.03777: when evaluation is False, skipping this task 30564 1726882824.03779: _execute() done 30564 1726882824.03782: dumping result to json 30564 1726882824.03784: done dumping result, returning 30564 1726882824.03791: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000073a] 30564 1726882824.03796: sending task result for task 0e448fcc-3ce9-4216-acec-00000000073a 30564 1726882824.03885: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000073a 30564 1726882824.03895: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882824.03939: no more pending results, returning what we have 30564 1726882824.03943: results queue empty 30564 1726882824.03944: checking for any_errors_fatal 30564 1726882824.03952: done checking for any_errors_fatal 30564 1726882824.03953: checking for max_fail_percentage 30564 1726882824.03954: done checking for max_fail_percentage 30564 1726882824.03955: checking to see if all hosts have failed and the running result is not ok 30564 1726882824.03956: done checking to see if all hosts have failed 30564 1726882824.03956: getting the remaining hosts for this loop 30564 1726882824.03958: done getting the remaining hosts for this loop 30564 1726882824.03962: getting the next task for host managed_node2 30564 1726882824.03972: done getting next task for host managed_node2 30564 1726882824.03976: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882824.03981: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882824.03999: getting variables 30564 1726882824.04000: in VariableManager get_vars() 30564 1726882824.04035: Calling all_inventory to load vars for managed_node2 30564 1726882824.04038: Calling groups_inventory to load vars for managed_node2 30564 1726882824.04040: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882824.04050: Calling all_plugins_play to load vars for managed_node2 30564 1726882824.04053: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882824.04056: Calling groups_plugins_play to load vars for managed_node2 30564 1726882824.04900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882824.06511: done with get_vars() 30564 1726882824.06528: done getting variables 30564 1726882824.06574: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:40:24 -0400 (0:00:00.089) 0:00:22.647 ****** 30564 1726882824.06598: entering _queue_task() for managed_node2/service 30564 1726882824.06815: worker is 1 (out of 1 available) 30564 1726882824.06829: exiting _queue_task() for managed_node2/service 30564 1726882824.06841: done queuing things up, now waiting for results queue to drain 30564 1726882824.06842: waiting for pending results... 
30564 1726882824.07062: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882824.07222: in run() - task 0e448fcc-3ce9-4216-acec-00000000073b 30564 1726882824.07243: variable 'ansible_search_path' from source: unknown 30564 1726882824.07251: variable 'ansible_search_path' from source: unknown 30564 1726882824.07300: calling self._execute() 30564 1726882824.07406: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.07533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.07550: variable 'omit' from source: magic vars 30564 1726882824.08024: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.08055: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882824.08480: variable 'network_provider' from source: set_fact 30564 1726882824.08522: variable 'network_state' from source: role '' defaults 30564 1726882824.08576: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882824.08588: variable 'omit' from source: magic vars 30564 1726882824.08655: variable 'omit' from source: magic vars 30564 1726882824.08706: variable 'network_service_name' from source: role '' defaults 30564 1726882824.08785: variable 'network_service_name' from source: role '' defaults 30564 1726882824.08902: variable '__network_provider_setup' from source: role '' defaults 30564 1726882824.08914: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882824.08990: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882824.09093: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882824.09165: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882824.09402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 30564 1726882824.12044: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882824.12100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882824.12126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882824.12150: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882824.12176: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882824.12229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.12251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.12271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.12304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.12314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.12346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30564 1726882824.12362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.12388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.12413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.12423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.12569: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882824.12647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.12665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.12686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.12714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.12724: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.12789: variable 'ansible_python' from source: facts 30564 1726882824.12801: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882824.12859: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882824.12916: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882824.13002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.13044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.13070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.13096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.13114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.13168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.13191: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.13211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.13260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.13279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.13397: variable 'network_connections' from source: include params 30564 1726882824.13406: variable 'interface' from source: play vars 30564 1726882824.13488: variable 'interface' from source: play vars 30564 1726882824.13596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882824.13803: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882824.13857: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882824.13913: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882824.13955: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882824.14030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882824.14062: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882824.14111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.14151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882824.14210: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882824.14447: variable 'network_connections' from source: include params 30564 1726882824.14451: variable 'interface' from source: play vars 30564 1726882824.14506: variable 'interface' from source: play vars 30564 1726882824.14576: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882824.14649: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882824.14960: variable 'network_connections' from source: include params 30564 1726882824.14978: variable 'interface' from source: play vars 30564 1726882824.15060: variable 'interface' from source: play vars 30564 1726882824.15094: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882824.15183: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882824.15491: variable 'network_connections' from source: include params 30564 1726882824.15501: variable 'interface' from source: play vars 30564 1726882824.15581: variable 'interface' from source: play vars 30564 1726882824.15647: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882824.15720: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 
1726882824.15733: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882824.15805: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882824.16004: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882824.16322: variable 'network_connections' from source: include params 30564 1726882824.16325: variable 'interface' from source: play vars 30564 1726882824.16373: variable 'interface' from source: play vars 30564 1726882824.16378: variable 'ansible_distribution' from source: facts 30564 1726882824.16381: variable '__network_rh_distros' from source: role '' defaults 30564 1726882824.16387: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.16409: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882824.16520: variable 'ansible_distribution' from source: facts 30564 1726882824.16524: variable '__network_rh_distros' from source: role '' defaults 30564 1726882824.16529: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.16536: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882824.16647: variable 'ansible_distribution' from source: facts 30564 1726882824.16651: variable '__network_rh_distros' from source: role '' defaults 30564 1726882824.16660: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.16687: variable 'network_provider' from source: set_fact 30564 1726882824.16704: variable 'omit' from source: magic vars 30564 1726882824.16724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882824.16746: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882824.16759: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882824.16778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882824.16786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882824.16809: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882824.16812: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.16814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.16885: Set connection var ansible_timeout to 10 30564 1726882824.16889: Set connection var ansible_pipelining to False 30564 1726882824.16892: Set connection var ansible_shell_type to sh 30564 1726882824.16897: Set connection var ansible_shell_executable to /bin/sh 30564 1726882824.16904: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882824.16906: Set connection var ansible_connection to ssh 30564 1726882824.16926: variable 'ansible_shell_executable' from source: unknown 30564 1726882824.16928: variable 'ansible_connection' from source: unknown 30564 1726882824.16931: variable 'ansible_module_compression' from source: unknown 30564 1726882824.16933: variable 'ansible_shell_type' from source: unknown 30564 1726882824.16936: variable 'ansible_shell_executable' from source: unknown 30564 1726882824.16938: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.16944: variable 'ansible_pipelining' from source: unknown 30564 1726882824.16946: variable 'ansible_timeout' from source: unknown 30564 1726882824.16948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.17023: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882824.17031: variable 'omit' from source: magic vars 30564 1726882824.17036: starting attempt loop 30564 1726882824.17038: running the handler 30564 1726882824.17099: variable 'ansible_facts' from source: unknown 30564 1726882824.17520: _low_level_execute_command(): starting 30564 1726882824.17531: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882824.18187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882824.18198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.18208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.18222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.18257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.18265: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882824.18275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.18290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882824.18297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882824.18304: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882824.18311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.18320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.18331: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.18337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.18344: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882824.18353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.18427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.18454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882824.18457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.18605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.20248: stdout chunk (state=3): >>>/root <<< 30564 1726882824.20351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882824.20400: stderr chunk (state=3): >>><<< 30564 1726882824.20405: stdout chunk (state=3): >>><<< 30564 1726882824.20424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882824.20433: _low_level_execute_command(): starting 30564 1726882824.20438: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888 `" && echo ansible-tmp-1726882824.2042313-31552-232311017189888="` echo /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888 `" ) && sleep 0' 30564 1726882824.20877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.20883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.20893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.20923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.20930: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882824.20937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.20948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882824.20955: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882824.20960: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.20972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.20978: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.20985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.20990: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882824.20995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.21052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.21059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882824.21062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.21188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.23047: stdout chunk (state=3): >>>ansible-tmp-1726882824.2042313-31552-232311017189888=/root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888 <<< 30564 1726882824.23156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882824.23205: stderr chunk (state=3): >>><<< 30564 1726882824.23208: stdout chunk (state=3): >>><<< 30564 1726882824.23219: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882824.2042313-31552-232311017189888=/root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882824.23245: variable 'ansible_module_compression' from source: unknown 30564 1726882824.23290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882824.23338: variable 'ansible_facts' from source: unknown 30564 1726882824.23477: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888/AnsiballZ_systemd.py 30564 1726882824.23585: Sending initial data 30564 1726882824.23589: Sent initial data (156 bytes) 30564 1726882824.24216: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.24220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.24250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.24266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882824.24278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.24323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.24335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.24440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.26227: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882824.26232: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882824.26324: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882824.26421: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp0pctn27r /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888/AnsiballZ_systemd.py <<< 30564 1726882824.26518: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882824.28633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882824.28726: stderr chunk (state=3): >>><<< 
30564 1726882824.28735: stdout chunk (state=3): >>><<< 30564 1726882824.28773: done transferring module to remote 30564 1726882824.28786: _low_level_execute_command(): starting 30564 1726882824.28794: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888/ /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888/AnsiballZ_systemd.py && sleep 0' 30564 1726882824.29444: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882824.29457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.29476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.29498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.29542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.29552: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882824.29569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.29588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882824.29602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882824.29615: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882824.29627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.29645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.29661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.29681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 <<< 30564 1726882824.29693: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882824.29708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.29795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.29820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882824.29841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.29977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.31773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882824.31829: stderr chunk (state=3): >>><<< 30564 1726882824.31836: stdout chunk (state=3): >>><<< 30564 1726882824.31869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882824.31872: _low_level_execute_command(): starting 30564 1726882824.31875: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888/AnsiballZ_systemd.py && sleep 0' 30564 1726882824.32493: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.32535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.32539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882824.32553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.32683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.57793: stdout chunk (state=3): >>> <<< 30564 1726882824.57837: stdout chunk (state=3): >>>{"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", 
"TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9138176", "MemoryAvailable": "infinity", "CPUUsageNSec": 
"2089778000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": <<< 30564 1726882824.57875: stdout chunk (state=3): >>>"0", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", 
"SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", 
"AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882824.59539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882824.59543: stdout chunk (state=3): >>><<< 30564 1726882824.59546: stderr chunk (state=3): >>><<< 30564 1726882824.59776: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", 
"ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9138176", "MemoryAvailable": "infinity", "CPUUsageNSec": "2089778000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid 
cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": 
"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882824.59787: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882824.59790: _low_level_execute_command(): starting 30564 1726882824.59811: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882824.2042313-31552-232311017189888/ > /dev/null 2>&1 && sleep 0' 30564 1726882824.60454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882824.60470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.60485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.60503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.60552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.60568: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882824.60588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.60616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882824.60635: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882824.60667: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882824.60682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.60706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.60739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.60766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.60778: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882824.60792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.60890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882824.60896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.61011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.62845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882824.62900: stderr chunk (state=3): >>><<< 30564 1726882824.62903: stdout chunk (state=3): >>><<< 30564 1726882824.62919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882824.62952: handler run complete 30564 1726882824.62981: attempt loop complete, returning result 30564 1726882824.62984: _execute() done 30564 1726882824.62987: dumping result to json 30564 1726882824.62997: done dumping result, returning 30564 1726882824.63006: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-00000000073b] 30564 1726882824.63517: sending task result for task 0e448fcc-3ce9-4216-acec-00000000073b 30564 1726882824.63651: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000073b 30564 1726882824.63654: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882824.63709: no more pending results, returning what we have 30564 1726882824.63712: results queue empty 30564 1726882824.63713: checking for any_errors_fatal 30564 1726882824.63719: done checking for any_errors_fatal 30564 1726882824.63720: checking for max_fail_percentage 30564 1726882824.63721: done checking for max_fail_percentage 30564 1726882824.63722: checking to see if all hosts have failed and the running result is not ok 30564 1726882824.63723: done checking to see if all hosts have failed 30564 
1726882824.63724: getting the remaining hosts for this loop 30564 1726882824.63725: done getting the remaining hosts for this loop 30564 1726882824.63728: getting the next task for host managed_node2 30564 1726882824.63734: done getting next task for host managed_node2 30564 1726882824.63737: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882824.63742: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882824.63752: getting variables 30564 1726882824.63753: in VariableManager get_vars() 30564 1726882824.63782: Calling all_inventory to load vars for managed_node2 30564 1726882824.63785: Calling groups_inventory to load vars for managed_node2 30564 1726882824.63787: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882824.63795: Calling all_plugins_play to load vars for managed_node2 30564 1726882824.63799: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882824.63803: Calling groups_plugins_play to load vars for managed_node2 30564 1726882824.65349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882824.66348: done with get_vars() 30564 1726882824.66366: done getting variables 30564 1726882824.66410: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:40:24 -0400 (0:00:00.598) 0:00:23.245 ****** 30564 1726882824.66437: entering _queue_task() for managed_node2/service 30564 1726882824.66664: worker is 1 (out of 1 available) 30564 1726882824.66681: exiting _queue_task() for managed_node2/service 30564 1726882824.66693: done queuing things up, now waiting for results queue to drain 30564 1726882824.66694: waiting for pending results... 
30564 1726882824.66881: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882824.66976: in run() - task 0e448fcc-3ce9-4216-acec-00000000073c 30564 1726882824.66993: variable 'ansible_search_path' from source: unknown 30564 1726882824.66996: variable 'ansible_search_path' from source: unknown 30564 1726882824.67028: calling self._execute() 30564 1726882824.67107: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.67110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.67120: variable 'omit' from source: magic vars 30564 1726882824.67578: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.67596: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882824.67730: variable 'network_provider' from source: set_fact 30564 1726882824.67741: Evaluated conditional (network_provider == "nm"): True 30564 1726882824.67847: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882824.67965: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882824.68260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882824.69817: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882824.69861: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882824.69893: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882824.69919: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882824.69940: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882824.70011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.70029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.70046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.70078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.70089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.70121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.70138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.70155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.70186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.70197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.70228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.70241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.70257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.70288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.70298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.70396: variable 'network_connections' from source: include params 30564 1726882824.70403: variable 'interface' from source: play vars 30564 1726882824.70449: variable 'interface' from source: play vars 30564 1726882824.70503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882824.70609: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882824.70635: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882824.70662: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882824.70687: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882824.70720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882824.70735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882824.70752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.70777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882824.70815: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882824.70965: variable 'network_connections' from source: include params 30564 1726882824.70969: variable 'interface' from source: play vars 30564 1726882824.71016: variable 'interface' from source: play vars 30564 1726882824.71045: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882824.71049: when evaluation is False, skipping this task 30564 1726882824.71051: _execute() done 30564 1726882824.71054: dumping result to json 30564 1726882824.71056: done dumping result, returning 30564 1726882824.71061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-00000000073c] 30564 
1726882824.71075: sending task result for task 0e448fcc-3ce9-4216-acec-00000000073c 30564 1726882824.71169: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000073c 30564 1726882824.71173: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882824.71243: no more pending results, returning what we have 30564 1726882824.71246: results queue empty 30564 1726882824.71247: checking for any_errors_fatal 30564 1726882824.71268: done checking for any_errors_fatal 30564 1726882824.71269: checking for max_fail_percentage 30564 1726882824.71271: done checking for max_fail_percentage 30564 1726882824.71272: checking to see if all hosts have failed and the running result is not ok 30564 1726882824.71273: done checking to see if all hosts have failed 30564 1726882824.71273: getting the remaining hosts for this loop 30564 1726882824.71275: done getting the remaining hosts for this loop 30564 1726882824.71279: getting the next task for host managed_node2 30564 1726882824.71287: done getting next task for host managed_node2 30564 1726882824.71291: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882824.71296: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882824.71309: getting variables 30564 1726882824.71311: in VariableManager get_vars() 30564 1726882824.71342: Calling all_inventory to load vars for managed_node2 30564 1726882824.71344: Calling groups_inventory to load vars for managed_node2 30564 1726882824.71346: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882824.71354: Calling all_plugins_play to load vars for managed_node2 30564 1726882824.71357: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882824.71359: Calling groups_plugins_play to load vars for managed_node2 30564 1726882824.72238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882824.73194: done with get_vars() 30564 1726882824.73210: done getting variables 30564 1726882824.73252: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:40:24 -0400 (0:00:00.068) 0:00:23.314 
****** 30564 1726882824.73279: entering _queue_task() for managed_node2/service 30564 1726882824.73488: worker is 1 (out of 1 available) 30564 1726882824.73503: exiting _queue_task() for managed_node2/service 30564 1726882824.73515: done queuing things up, now waiting for results queue to drain 30564 1726882824.73516: waiting for pending results... 30564 1726882824.73698: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882824.73797: in run() - task 0e448fcc-3ce9-4216-acec-00000000073d 30564 1726882824.73808: variable 'ansible_search_path' from source: unknown 30564 1726882824.73812: variable 'ansible_search_path' from source: unknown 30564 1726882824.73838: calling self._execute() 30564 1726882824.73917: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.73921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.73930: variable 'omit' from source: magic vars 30564 1726882824.74206: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.74217: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882824.74298: variable 'network_provider' from source: set_fact 30564 1726882824.74304: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882824.74307: when evaluation is False, skipping this task 30564 1726882824.74310: _execute() done 30564 1726882824.74313: dumping result to json 30564 1726882824.74315: done dumping result, returning 30564 1726882824.74321: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-00000000073d] 30564 1726882824.74331: sending task result for task 0e448fcc-3ce9-4216-acec-00000000073d 30564 1726882824.74422: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000073d 30564 1726882824.74425: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882824.74477: no more pending results, returning what we have 30564 1726882824.74481: results queue empty 30564 1726882824.74482: checking for any_errors_fatal 30564 1726882824.74488: done checking for any_errors_fatal 30564 1726882824.74489: checking for max_fail_percentage 30564 1726882824.74490: done checking for max_fail_percentage 30564 1726882824.74491: checking to see if all hosts have failed and the running result is not ok 30564 1726882824.74492: done checking to see if all hosts have failed 30564 1726882824.74492: getting the remaining hosts for this loop 30564 1726882824.74494: done getting the remaining hosts for this loop 30564 1726882824.74497: getting the next task for host managed_node2 30564 1726882824.74504: done getting next task for host managed_node2 30564 1726882824.74508: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882824.74512: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882824.74527: getting variables 30564 1726882824.74528: in VariableManager get_vars() 30564 1726882824.74565: Calling all_inventory to load vars for managed_node2 30564 1726882824.74568: Calling groups_inventory to load vars for managed_node2 30564 1726882824.74570: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882824.74578: Calling all_plugins_play to load vars for managed_node2 30564 1726882824.74581: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882824.74583: Calling groups_plugins_play to load vars for managed_node2 30564 1726882824.75354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882824.76308: done with get_vars() 30564 1726882824.76322: done getting variables 30564 1726882824.76361: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:40:24 -0400 (0:00:00.031) 0:00:23.345 ****** 30564 1726882824.76389: entering _queue_task() for managed_node2/copy 30564 1726882824.76577: worker is 1 (out of 1 available) 30564 1726882824.76591: exiting _queue_task() for managed_node2/copy 30564 1726882824.76603: done queuing things up, now waiting for results queue to drain 30564 1726882824.76605: waiting for 
pending results... 30564 1726882824.76785: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882824.76875: in run() - task 0e448fcc-3ce9-4216-acec-00000000073e 30564 1726882824.76886: variable 'ansible_search_path' from source: unknown 30564 1726882824.76890: variable 'ansible_search_path' from source: unknown 30564 1726882824.76919: calling self._execute() 30564 1726882824.76994: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.76998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.77007: variable 'omit' from source: magic vars 30564 1726882824.77290: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.77301: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882824.77381: variable 'network_provider' from source: set_fact 30564 1726882824.77387: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882824.77390: when evaluation is False, skipping this task 30564 1726882824.77393: _execute() done 30564 1726882824.77396: dumping result to json 30564 1726882824.77398: done dumping result, returning 30564 1726882824.77407: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-00000000073e] 30564 1726882824.77413: sending task result for task 0e448fcc-3ce9-4216-acec-00000000073e 30564 1726882824.77505: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000073e 30564 1726882824.77507: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882824.77555: no more pending results, returning what we have 30564 1726882824.77558: results queue empty 30564 
1726882824.77559: checking for any_errors_fatal 30564 1726882824.77569: done checking for any_errors_fatal 30564 1726882824.77570: checking for max_fail_percentage 30564 1726882824.77572: done checking for max_fail_percentage 30564 1726882824.77573: checking to see if all hosts have failed and the running result is not ok 30564 1726882824.77573: done checking to see if all hosts have failed 30564 1726882824.77574: getting the remaining hosts for this loop 30564 1726882824.77576: done getting the remaining hosts for this loop 30564 1726882824.77579: getting the next task for host managed_node2 30564 1726882824.77585: done getting next task for host managed_node2 30564 1726882824.77588: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882824.77592: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882824.77607: getting variables 30564 1726882824.77609: in VariableManager get_vars() 30564 1726882824.77644: Calling all_inventory to load vars for managed_node2 30564 1726882824.77646: Calling groups_inventory to load vars for managed_node2 30564 1726882824.77648: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882824.77656: Calling all_plugins_play to load vars for managed_node2 30564 1726882824.77658: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882824.77660: Calling groups_plugins_play to load vars for managed_node2 30564 1726882824.78543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882824.79492: done with get_vars() 30564 1726882824.79506: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:40:24 -0400 (0:00:00.031) 0:00:23.376 ****** 30564 1726882824.79562: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882824.79766: worker is 1 (out of 1 available) 30564 1726882824.79780: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882824.79793: done queuing things up, now waiting for results queue to drain 30564 1726882824.79794: waiting for pending results... 
30564 1726882824.79974: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882824.80063: in run() - task 0e448fcc-3ce9-4216-acec-00000000073f 30564 1726882824.80075: variable 'ansible_search_path' from source: unknown 30564 1726882824.80079: variable 'ansible_search_path' from source: unknown 30564 1726882824.80105: calling self._execute() 30564 1726882824.80187: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.80191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.80200: variable 'omit' from source: magic vars 30564 1726882824.80483: variable 'ansible_distribution_major_version' from source: facts 30564 1726882824.80494: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882824.80499: variable 'omit' from source: magic vars 30564 1726882824.80542: variable 'omit' from source: magic vars 30564 1726882824.80652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882824.82171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882824.82217: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882824.82244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882824.82272: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882824.82295: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882824.82351: variable 'network_provider' from source: set_fact 30564 1726882824.82462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882824.82486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882824.82503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882824.82530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882824.82541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882824.82596: variable 'omit' from source: magic vars 30564 1726882824.82668: variable 'omit' from source: magic vars 30564 1726882824.82741: variable 'network_connections' from source: include params 30564 1726882824.82751: variable 'interface' from source: play vars 30564 1726882824.82799: variable 'interface' from source: play vars 30564 1726882824.82912: variable 'omit' from source: magic vars 30564 1726882824.82919: variable '__lsr_ansible_managed' from source: task vars 30564 1726882824.82960: variable '__lsr_ansible_managed' from source: task vars 30564 1726882824.83099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882824.83237: Loaded config def from plugin (lookup/template) 30564 1726882824.83241: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882824.83261: File lookup term: get_ansible_managed.j2 30564 1726882824.83270: variable 
'ansible_search_path' from source: unknown 30564 1726882824.83274: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882824.83282: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882824.83294: variable 'ansible_search_path' from source: unknown 30564 1726882824.88689: variable 'ansible_managed' from source: unknown 30564 1726882824.88829: variable 'omit' from source: magic vars 30564 1726882824.88861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882824.88898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882824.88919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882824.88939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882824.88952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882824.88988: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882824.88996: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.89004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.89105: Set connection var ansible_timeout to 10 30564 1726882824.89110: Set connection var ansible_pipelining to False 30564 1726882824.89117: Set connection var ansible_shell_type to sh 30564 1726882824.89126: Set connection var ansible_shell_executable to /bin/sh 30564 1726882824.89137: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882824.89142: Set connection var ansible_connection to ssh 30564 1726882824.89174: variable 'ansible_shell_executable' from source: unknown 30564 1726882824.89181: variable 'ansible_connection' from source: unknown 30564 1726882824.89187: variable 'ansible_module_compression' from source: unknown 30564 1726882824.89193: variable 'ansible_shell_type' from source: unknown 30564 1726882824.89199: variable 'ansible_shell_executable' from source: unknown 30564 1726882824.89205: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882824.89212: variable 'ansible_pipelining' from source: unknown 30564 1726882824.89218: variable 'ansible_timeout' from source: unknown 30564 1726882824.89225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882824.89353: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882824.89380: variable 'omit' from 
source: magic vars 30564 1726882824.89390: starting attempt loop 30564 1726882824.89397: running the handler 30564 1726882824.89412: _low_level_execute_command(): starting 30564 1726882824.89422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882824.90128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882824.90142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.90156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.90179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.90221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.90233: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882824.90245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.90261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882824.90282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882824.90293: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882824.90305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.90319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.90333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.90343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.90353: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882824.90370: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.90446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.90474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882824.90493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.90632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.92290: stdout chunk (state=3): >>>/root <<< 30564 1726882824.92481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882824.92484: stdout chunk (state=3): >>><<< 30564 1726882824.92487: stderr chunk (state=3): >>><<< 30564 1726882824.92594: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882824.92597: 
_low_level_execute_command(): starting 30564 1726882824.92601: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748 `" && echo ansible-tmp-1726882824.925052-31592-85637743426748="` echo /root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748 `" ) && sleep 0' 30564 1726882824.93288: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882824.93301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.93320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.93339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.93412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882824.93436: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882824.93469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.93500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882824.93523: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882824.93548: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882824.93568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.93586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.93601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.93622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882824.93654: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.93715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.93719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.93827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.95755: stdout chunk (state=3): >>>ansible-tmp-1726882824.925052-31592-85637743426748=/root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748 <<< 30564 1726882824.95939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882824.95942: stdout chunk (state=3): >>><<< 30564 1726882824.95945: stderr chunk (state=3): >>><<< 30564 1726882824.96173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882824.925052-31592-85637743426748=/root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882824.96181: variable 'ansible_module_compression' from source: unknown 30564 1726882824.96183: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882824.96186: variable 'ansible_facts' from source: unknown 30564 1726882824.96210: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748/AnsiballZ_network_connections.py 30564 1726882824.96356: Sending initial data 30564 1726882824.96360: Sent initial data (166 bytes) 30564 1726882824.97030: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882824.97034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882824.97073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.97076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882824.97084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882824.97129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882824.97133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882824.97242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882824.99049: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882824.99150: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882824.99252: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpxeii_6b_ /root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748/AnsiballZ_network_connections.py <<< 30564 1726882824.99352: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882825.00881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882825.00972: stderr chunk (state=3): >>><<< 30564 1726882825.00975: stdout chunk (state=3): >>><<< 30564 1726882825.00994: done transferring module to remote 30564 1726882825.01002: _low_level_execute_command(): starting 30564 1726882825.01006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748/ /root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748/AnsiballZ_network_connections.py && sleep 0' 30564 1726882825.01421: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.01426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.01473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.01478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.01480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.01521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.01542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882825.01546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.01650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882825.03531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882825.03543: stderr chunk (state=3): >>><<< 30564 1726882825.03546: stdout chunk (state=3): >>><<< 30564 
1726882825.03562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882825.03567: _low_level_execute_command(): starting 30564 1726882825.03595: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748/AnsiballZ_network_connections.py && sleep 0' 30564 1726882825.04225: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.04230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.04272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.04276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882825.04285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.04291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.04301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.04306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.04359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.04374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.04493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882825.28398: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 25972e22-5267-43e5-84f8-5cddc8875a78\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882825.30091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882825.30134: stderr chunk (state=3): >>><<< 30564 1726882825.30137: stdout chunk (state=3): >>><<< 30564 1726882825.30173: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 25972e22-5267-43e5-84f8-5cddc8875a78\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "autoconnect": false, "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882825.30307: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'autoconnect': False, 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882825.30311: _low_level_execute_command(): starting 30564 1726882825.30314: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882824.925052-31592-85637743426748/ > /dev/null 2>&1 && sleep 0' 30564 1726882825.30966: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882825.30985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.31001: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.31020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.31069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882825.31089: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882825.31105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.31122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882825.31133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882825.31144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882825.31156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.31175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.31200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.31213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882825.31225: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882825.31239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.31323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.31341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882825.31354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.31500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882825.33386: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882825.33447: stderr chunk (state=3): >>><<< 30564 1726882825.33451: stdout chunk (state=3): >>><<< 30564 1726882825.33474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882825.33482: handler run complete 30564 1726882825.33515: attempt loop complete, returning result 30564 1726882825.33519: _execute() done 30564 1726882825.33521: dumping result to json 30564 1726882825.33526: done dumping result, returning 30564 1726882825.33537: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-00000000073f] 30564 1726882825.33543: sending task result for task 0e448fcc-3ce9-4216-acec-00000000073f 30564 1726882825.33661: done sending task result for task 
0e448fcc-3ce9-4216-acec-00000000073f 30564 1726882825.33665: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 25972e22-5267-43e5-84f8-5cddc8875a78 30564 1726882825.33773: no more pending results, returning what we have 30564 1726882825.33777: results queue empty 30564 1726882825.33779: checking for any_errors_fatal 30564 1726882825.33785: done checking for any_errors_fatal 30564 1726882825.33785: checking for max_fail_percentage 30564 1726882825.33787: done checking for max_fail_percentage 30564 1726882825.33788: checking to see if all hosts have failed and the running result is not ok 30564 1726882825.33788: done checking to see if all hosts have failed 30564 1726882825.33789: getting the remaining hosts for this loop 30564 1726882825.33791: done getting the remaining hosts for this loop 30564 1726882825.33794: getting the next task for host managed_node2 30564 1726882825.33801: done getting next task for host managed_node2 30564 1726882825.33804: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882825.33809: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882825.33820: getting variables 30564 1726882825.33821: in VariableManager get_vars() 30564 1726882825.33854: Calling all_inventory to load vars for managed_node2 30564 1726882825.33856: Calling groups_inventory to load vars for managed_node2 30564 1726882825.33858: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882825.33869: Calling all_plugins_play to load vars for managed_node2 30564 1726882825.33871: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882825.33876: Calling groups_plugins_play to load vars for managed_node2 30564 1726882825.35801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882825.37581: done with get_vars() 30564 1726882825.37604: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:40:25 -0400 (0:00:00.581) 0:00:23.958 ****** 30564 1726882825.37691: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882825.38001: worker is 1 (out 
of 1 available) 30564 1726882825.38014: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882825.38025: done queuing things up, now waiting for results queue to drain 30564 1726882825.38026: waiting for pending results... 30564 1726882825.38333: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882825.38466: in run() - task 0e448fcc-3ce9-4216-acec-000000000740 30564 1726882825.38489: variable 'ansible_search_path' from source: unknown 30564 1726882825.38495: variable 'ansible_search_path' from source: unknown 30564 1726882825.38531: calling self._execute() 30564 1726882825.38641: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.38653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.38672: variable 'omit' from source: magic vars 30564 1726882825.39091: variable 'ansible_distribution_major_version' from source: facts 30564 1726882825.39110: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882825.39250: variable 'network_state' from source: role '' defaults 30564 1726882825.39270: Evaluated conditional (network_state != {}): False 30564 1726882825.39279: when evaluation is False, skipping this task 30564 1726882825.39286: _execute() done 30564 1726882825.39293: dumping result to json 30564 1726882825.39301: done dumping result, returning 30564 1726882825.39311: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000000740] 30564 1726882825.39321: sending task result for task 0e448fcc-3ce9-4216-acec-000000000740 30564 1726882825.39445: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000740 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 
1726882825.39508: no more pending results, returning what we have 30564 1726882825.39513: results queue empty 30564 1726882825.39514: checking for any_errors_fatal 30564 1726882825.39527: done checking for any_errors_fatal 30564 1726882825.39528: checking for max_fail_percentage 30564 1726882825.39530: done checking for max_fail_percentage 30564 1726882825.39531: checking to see if all hosts have failed and the running result is not ok 30564 1726882825.39532: done checking to see if all hosts have failed 30564 1726882825.39533: getting the remaining hosts for this loop 30564 1726882825.39535: done getting the remaining hosts for this loop 30564 1726882825.39539: getting the next task for host managed_node2 30564 1726882825.39548: done getting next task for host managed_node2 30564 1726882825.39552: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882825.39560: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882825.39585: getting variables 30564 1726882825.39587: in VariableManager get_vars() 30564 1726882825.39626: Calling all_inventory to load vars for managed_node2 30564 1726882825.39629: Calling groups_inventory to load vars for managed_node2 30564 1726882825.39632: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882825.39645: Calling all_plugins_play to load vars for managed_node2 30564 1726882825.39649: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882825.39653: Calling groups_plugins_play to load vars for managed_node2 30564 1726882825.40763: WORKER PROCESS EXITING 30564 1726882825.41409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882825.43162: done with get_vars() 30564 1726882825.43191: done getting variables 30564 1726882825.43249: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:40:25 -0400 (0:00:00.055) 0:00:24.014 ****** 30564 1726882825.43290: entering _queue_task() for managed_node2/debug 30564 1726882825.43578: worker is 1 (out of 1 available) 30564 1726882825.43591: exiting _queue_task() for managed_node2/debug 30564 1726882825.43603: done queuing things up, now waiting for results queue to drain 30564 1726882825.43604: waiting for pending results... 
30564 1726882825.43894: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882825.44043: in run() - task 0e448fcc-3ce9-4216-acec-000000000741 30564 1726882825.44071: variable 'ansible_search_path' from source: unknown 30564 1726882825.44080: variable 'ansible_search_path' from source: unknown 30564 1726882825.44119: calling self._execute() 30564 1726882825.44222: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.44235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.44249: variable 'omit' from source: magic vars 30564 1726882825.44629: variable 'ansible_distribution_major_version' from source: facts 30564 1726882825.44648: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882825.44660: variable 'omit' from source: magic vars 30564 1726882825.44733: variable 'omit' from source: magic vars 30564 1726882825.44774: variable 'omit' from source: magic vars 30564 1726882825.44822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882825.44862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882825.44893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882825.44919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882825.44936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882825.44974: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882825.44983: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.44992: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882825.45104: Set connection var ansible_timeout to 10 30564 1726882825.45116: Set connection var ansible_pipelining to False 30564 1726882825.45123: Set connection var ansible_shell_type to sh 30564 1726882825.45138: Set connection var ansible_shell_executable to /bin/sh 30564 1726882825.45151: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882825.45158: Set connection var ansible_connection to ssh 30564 1726882825.45190: variable 'ansible_shell_executable' from source: unknown 30564 1726882825.45199: variable 'ansible_connection' from source: unknown 30564 1726882825.45207: variable 'ansible_module_compression' from source: unknown 30564 1726882825.45213: variable 'ansible_shell_type' from source: unknown 30564 1726882825.45220: variable 'ansible_shell_executable' from source: unknown 30564 1726882825.45226: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.45235: variable 'ansible_pipelining' from source: unknown 30564 1726882825.45244: variable 'ansible_timeout' from source: unknown 30564 1726882825.45252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.45400: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882825.45418: variable 'omit' from source: magic vars 30564 1726882825.45428: starting attempt loop 30564 1726882825.45435: running the handler 30564 1726882825.45563: variable '__network_connections_result' from source: set_fact 30564 1726882825.45624: handler run complete 30564 1726882825.45646: attempt loop complete, returning result 30564 1726882825.45653: _execute() done 30564 1726882825.45659: dumping result to json 30564 1726882825.45672: 
done dumping result, returning 30564 1726882825.45690: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-000000000741] 30564 1726882825.45700: sending task result for task 0e448fcc-3ce9-4216-acec-000000000741 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 25972e22-5267-43e5-84f8-5cddc8875a78" ] } 30564 1726882825.45863: no more pending results, returning what we have 30564 1726882825.45870: results queue empty 30564 1726882825.45872: checking for any_errors_fatal 30564 1726882825.45880: done checking for any_errors_fatal 30564 1726882825.45880: checking for max_fail_percentage 30564 1726882825.45882: done checking for max_fail_percentage 30564 1726882825.45883: checking to see if all hosts have failed and the running result is not ok 30564 1726882825.45884: done checking to see if all hosts have failed 30564 1726882825.45885: getting the remaining hosts for this loop 30564 1726882825.45887: done getting the remaining hosts for this loop 30564 1726882825.45891: getting the next task for host managed_node2 30564 1726882825.45899: done getting next task for host managed_node2 30564 1726882825.45904: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882825.45909: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882825.45922: getting variables 30564 1726882825.45924: in VariableManager get_vars() 30564 1726882825.45958: Calling all_inventory to load vars for managed_node2 30564 1726882825.45961: Calling groups_inventory to load vars for managed_node2 30564 1726882825.45965: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882825.45979: Calling all_plugins_play to load vars for managed_node2 30564 1726882825.45982: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882825.45985: Calling groups_plugins_play to load vars for managed_node2 30564 1726882825.46974: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000741 30564 1726882825.46978: WORKER PROCESS EXITING 30564 1726882825.48139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882825.50807: done with get_vars() 30564 1726882825.50835: done getting variables 30564 1726882825.50902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:40:25 -0400 (0:00:00.076) 0:00:24.090 ****** 30564 1726882825.50943: entering _queue_task() for managed_node2/debug 30564 1726882825.51270: worker is 1 (out of 1 available) 30564 1726882825.51284: exiting _queue_task() for managed_node2/debug 30564 1726882825.51297: done queuing things up, now waiting for results queue to drain 30564 1726882825.51299: waiting for pending results... 30564 1726882825.51605: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882825.51755: in run() - task 0e448fcc-3ce9-4216-acec-000000000742 30564 1726882825.51782: variable 'ansible_search_path' from source: unknown 30564 1726882825.51791: variable 'ansible_search_path' from source: unknown 30564 1726882825.51833: calling self._execute() 30564 1726882825.51940: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.51951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.51974: variable 'omit' from source: magic vars 30564 1726882825.52923: variable 'ansible_distribution_major_version' from source: facts 30564 1726882825.52943: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882825.52954: variable 'omit' from source: magic vars 30564 1726882825.53035: variable 'omit' from source: magic vars 30564 1726882825.53075: variable 'omit' from source: magic vars 30564 1726882825.53123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882825.53166: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882825.53195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882825.53224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882825.53239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882825.53275: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882825.53283: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.53291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.53399: Set connection var ansible_timeout to 10 30564 1726882825.53410: Set connection var ansible_pipelining to False 30564 1726882825.53417: Set connection var ansible_shell_type to sh 30564 1726882825.53429: Set connection var ansible_shell_executable to /bin/sh 30564 1726882825.53444: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882825.53451: Set connection var ansible_connection to ssh 30564 1726882825.53484: variable 'ansible_shell_executable' from source: unknown 30564 1726882825.53492: variable 'ansible_connection' from source: unknown 30564 1726882825.53499: variable 'ansible_module_compression' from source: unknown 30564 1726882825.53504: variable 'ansible_shell_type' from source: unknown 30564 1726882825.53510: variable 'ansible_shell_executable' from source: unknown 30564 1726882825.53516: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.53522: variable 'ansible_pipelining' from source: unknown 30564 1726882825.53528: variable 'ansible_timeout' from source: unknown 30564 1726882825.53535: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882825.53691: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882825.53706: variable 'omit' from source: magic vars 30564 1726882825.53716: starting attempt loop 30564 1726882825.53722: running the handler 30564 1726882825.53778: variable '__network_connections_result' from source: set_fact 30564 1726882825.53853: variable '__network_connections_result' from source: set_fact 30564 1726882825.53990: handler run complete 30564 1726882825.54023: attempt loop complete, returning result 30564 1726882825.54030: _execute() done 30564 1726882825.54038: dumping result to json 30564 1726882825.54046: done dumping result, returning 30564 1726882825.54057: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-000000000742] 30564 1726882825.54071: sending task result for task 0e448fcc-3ce9-4216-acec-000000000742 30564 1726882825.54189: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000742 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 25972e22-5267-43e5-84f8-5cddc8875a78\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 
25972e22-5267-43e5-84f8-5cddc8875a78" ] } } 30564 1726882825.54290: no more pending results, returning what we have 30564 1726882825.54293: results queue empty 30564 1726882825.54294: checking for any_errors_fatal 30564 1726882825.54304: done checking for any_errors_fatal 30564 1726882825.54305: checking for max_fail_percentage 30564 1726882825.54308: done checking for max_fail_percentage 30564 1726882825.54309: checking to see if all hosts have failed and the running result is not ok 30564 1726882825.54310: done checking to see if all hosts have failed 30564 1726882825.54311: getting the remaining hosts for this loop 30564 1726882825.54313: done getting the remaining hosts for this loop 30564 1726882825.54316: getting the next task for host managed_node2 30564 1726882825.54324: done getting next task for host managed_node2 30564 1726882825.54328: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882825.54334: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882825.54344: getting variables 30564 1726882825.54346: in VariableManager get_vars() 30564 1726882825.54394: Calling all_inventory to load vars for managed_node2 30564 1726882825.54397: Calling groups_inventory to load vars for managed_node2 30564 1726882825.54399: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882825.54410: Calling all_plugins_play to load vars for managed_node2 30564 1726882825.54414: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882825.54417: Calling groups_plugins_play to load vars for managed_node2 30564 1726882825.55486: WORKER PROCESS EXITING 30564 1726882825.56179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882825.58098: done with get_vars() 30564 1726882825.58121: done getting variables 30564 1726882825.58186: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:40:25 -0400 (0:00:00.072) 0:00:24.163 ****** 30564 1726882825.58219: entering _queue_task() for managed_node2/debug 30564 1726882825.58523: worker is 1 (out of 1 available) 30564 1726882825.58536: exiting _queue_task() for managed_node2/debug 30564 1726882825.58551: done queuing things up, now waiting for results queue to drain 30564 1726882825.58552: waiting for pending results... 
30564 1726882825.58848: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882825.58992: in run() - task 0e448fcc-3ce9-4216-acec-000000000743 30564 1726882825.59017: variable 'ansible_search_path' from source: unknown 30564 1726882825.59025: variable 'ansible_search_path' from source: unknown 30564 1726882825.59067: calling self._execute() 30564 1726882825.59176: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.59188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.59203: variable 'omit' from source: magic vars 30564 1726882825.59586: variable 'ansible_distribution_major_version' from source: facts 30564 1726882825.59604: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882825.59730: variable 'network_state' from source: role '' defaults 30564 1726882825.59744: Evaluated conditional (network_state != {}): False 30564 1726882825.59751: when evaluation is False, skipping this task 30564 1726882825.59761: _execute() done 30564 1726882825.59773: dumping result to json 30564 1726882825.59780: done dumping result, returning 30564 1726882825.59790: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000000743] 30564 1726882825.59801: sending task result for task 0e448fcc-3ce9-4216-acec-000000000743 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882825.59951: no more pending results, returning what we have 30564 1726882825.59955: results queue empty 30564 1726882825.59956: checking for any_errors_fatal 30564 1726882825.59966: done checking for any_errors_fatal 30564 1726882825.59969: checking for max_fail_percentage 30564 1726882825.59971: done checking for max_fail_percentage 30564 1726882825.59972: checking to see if all hosts have 
failed and the running result is not ok 30564 1726882825.59973: done checking to see if all hosts have failed 30564 1726882825.59974: getting the remaining hosts for this loop 30564 1726882825.59976: done getting the remaining hosts for this loop 30564 1726882825.59980: getting the next task for host managed_node2 30564 1726882825.59988: done getting next task for host managed_node2 30564 1726882825.59992: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882825.59998: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882825.60017: getting variables 30564 1726882825.60020: in VariableManager get_vars() 30564 1726882825.60055: Calling all_inventory to load vars for managed_node2 30564 1726882825.60058: Calling groups_inventory to load vars for managed_node2 30564 1726882825.60061: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882825.60078: Calling all_plugins_play to load vars for managed_node2 30564 1726882825.60081: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882825.60085: Calling groups_plugins_play to load vars for managed_node2 30564 1726882825.61085: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000743 30564 1726882825.61089: WORKER PROCESS EXITING 30564 1726882825.61780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882825.63570: done with get_vars() 30564 1726882825.63596: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:40:25 -0400 (0:00:00.054) 0:00:24.218 ****** 30564 1726882825.63696: entering _queue_task() for managed_node2/ping 30564 1726882825.64017: worker is 1 (out of 1 available) 30564 1726882825.64029: exiting _queue_task() for managed_node2/ping 30564 1726882825.64043: done queuing things up, now waiting for results queue to drain 30564 1726882825.64044: waiting for pending results... 
30564 1726882825.64335: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882825.64472: in run() - task 0e448fcc-3ce9-4216-acec-000000000744 30564 1726882825.64497: variable 'ansible_search_path' from source: unknown 30564 1726882825.64505: variable 'ansible_search_path' from source: unknown 30564 1726882825.64543: calling self._execute() 30564 1726882825.64645: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.64656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.64674: variable 'omit' from source: magic vars 30564 1726882825.65054: variable 'ansible_distribution_major_version' from source: facts 30564 1726882825.65080: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882825.65090: variable 'omit' from source: magic vars 30564 1726882825.65159: variable 'omit' from source: magic vars 30564 1726882825.65198: variable 'omit' from source: magic vars 30564 1726882825.65244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882825.65289: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882825.65312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882825.65333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882825.65349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882825.65389: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882825.65397: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.65404: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882825.65512: Set connection var ansible_timeout to 10 30564 1726882825.65522: Set connection var ansible_pipelining to False 30564 1726882825.65528: Set connection var ansible_shell_type to sh 30564 1726882825.65536: Set connection var ansible_shell_executable to /bin/sh 30564 1726882825.65547: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882825.65553: Set connection var ansible_connection to ssh 30564 1726882825.65588: variable 'ansible_shell_executable' from source: unknown 30564 1726882825.65595: variable 'ansible_connection' from source: unknown 30564 1726882825.65602: variable 'ansible_module_compression' from source: unknown 30564 1726882825.65608: variable 'ansible_shell_type' from source: unknown 30564 1726882825.65613: variable 'ansible_shell_executable' from source: unknown 30564 1726882825.65619: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882825.65625: variable 'ansible_pipelining' from source: unknown 30564 1726882825.65631: variable 'ansible_timeout' from source: unknown 30564 1726882825.65637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882825.65846: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882825.65862: variable 'omit' from source: magic vars 30564 1726882825.65878: starting attempt loop 30564 1726882825.65885: running the handler 30564 1726882825.65904: _low_level_execute_command(): starting 30564 1726882825.65915: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882825.66654: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882825.66678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882825.66695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.66713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.66756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882825.66772: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882825.66789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.66807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882825.66818: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882825.66828: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882825.66838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.66851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.66869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.66885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882825.66902: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882825.66917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.66993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.67023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882825.67039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.67179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882825.68878: stdout chunk (state=3): >>>/root <<< 30564 1726882825.69070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882825.69073: stdout chunk (state=3): >>><<< 30564 1726882825.69076: stderr chunk (state=3): >>><<< 30564 1726882825.69193: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882825.69196: _low_level_execute_command(): starting 30564 1726882825.69199: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937 `" && echo ansible-tmp-1726882825.6910114-31627-167593137751937="` echo /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937 `" ) && sleep 0' 30564 1726882825.70534: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.70538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.70580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882825.70584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.70588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.70640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.70888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882825.70894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.71008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882825.72983: stdout chunk (state=3): >>>ansible-tmp-1726882825.6910114-31627-167593137751937=/root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937 <<< 30564 1726882825.73110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882825.73174: stderr chunk (state=3): >>><<< 30564 1726882825.73178: stdout chunk (state=3): >>><<< 30564 1726882825.73475: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882825.6910114-31627-167593137751937=/root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882825.73479: variable 'ansible_module_compression' from source: unknown 30564 1726882825.73481: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882825.73483: variable 'ansible_facts' from source: unknown 30564 1726882825.73485: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937/AnsiballZ_ping.py 30564 1726882825.74019: Sending initial data 30564 1726882825.74022: Sent initial data (153 bytes) 30564 1726882825.77000: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.77004: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.77033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882825.77037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.77039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.77099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.77787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882825.77793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.77905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882825.79741: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 <<< 30564 1726882825.79838: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882825.79943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpb_sx2lws /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937/AnsiballZ_ping.py <<< 30564 1726882825.80036: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882825.81504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882825.81762: stderr chunk (state=3): >>><<< 30564 1726882825.81777: stdout chunk (state=3): >>><<< 30564 1726882825.81780: done transferring module to remote 30564 1726882825.81783: _low_level_execute_command(): starting 30564 1726882825.81785: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937/ /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937/AnsiballZ_ping.py && sleep 0' 30564 1726882825.82620: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.82624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.82663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882825.82671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882825.82674: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.82728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.83588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882825.83594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.83703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882825.85566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882825.85634: stderr chunk (state=3): >>><<< 30564 1726882825.85637: stdout chunk (state=3): >>><<< 30564 1726882825.85735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882825.85739: _low_level_execute_command(): starting 30564 1726882825.85741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937/AnsiballZ_ping.py && sleep 0' 30564 1726882825.87016: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.87019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882825.87059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882825.87062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882825.87066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882825.87118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882825.87288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882825.87411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882826.00887: stdout 
chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882826.01962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882826.02039: stderr chunk (state=3): >>><<< 30564 1726882826.02043: stdout chunk (state=3): >>><<< 30564 1726882826.02160: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882826.02167: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882826.02170: _low_level_execute_command(): starting 30564 1726882826.02172: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882825.6910114-31627-167593137751937/ > /dev/null 2>&1 && sleep 0' 30564 1726882826.03201: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.03205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.03243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882826.03247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.03249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882826.03251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.03309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882826.03462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882826.03487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882826.03620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882826.05492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882826.05579: stderr chunk (state=3): >>><<< 30564 1726882826.05594: stdout chunk (state=3): >>><<< 30564 1726882826.05973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30564 1726882826.05977: handler run complete 30564 1726882826.05979: attempt loop complete, returning result 30564 1726882826.05981: _execute() done 30564 1726882826.05984: dumping result to json 30564 1726882826.05986: done dumping result, returning 30564 1726882826.05988: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-000000000744] 30564 1726882826.05990: sending task result for task 0e448fcc-3ce9-4216-acec-000000000744 30564 1726882826.06057: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000744 30564 1726882826.06060: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882826.06131: no more pending results, returning what we have 30564 1726882826.06134: results queue empty 30564 1726882826.06135: checking for any_errors_fatal 30564 1726882826.06140: done checking for any_errors_fatal 30564 1726882826.06141: checking for max_fail_percentage 30564 1726882826.06143: done checking for max_fail_percentage 30564 1726882826.06143: checking to see if all hosts have failed and the running result is not ok 30564 1726882826.06144: done checking to see if all hosts have failed 30564 1726882826.06145: getting the remaining hosts for this loop 30564 1726882826.06146: done getting the remaining hosts for this loop 30564 1726882826.06149: getting the next task for host managed_node2 30564 1726882826.06159: done getting next task for host managed_node2 30564 1726882826.06161: ^ task is: TASK: meta (role_complete) 30564 1726882826.06170: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882826.06180: getting variables 30564 1726882826.06182: in VariableManager get_vars() 30564 1726882826.06216: Calling all_inventory to load vars for managed_node2 30564 1726882826.06219: Calling groups_inventory to load vars for managed_node2 30564 1726882826.06221: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882826.06230: Calling all_plugins_play to load vars for managed_node2 30564 1726882826.06233: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882826.06236: Calling groups_plugins_play to load vars for managed_node2 30564 1726882826.07832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882826.09577: done with get_vars() 30564 1726882826.09600: done getting variables 30564 1726882826.09686: done queuing things up, now waiting for results queue to drain 30564 1726882826.09689: results queue empty 30564 1726882826.09690: checking for any_errors_fatal 30564 1726882826.09692: done checking for 
any_errors_fatal 30564 1726882826.09693: checking for max_fail_percentage 30564 1726882826.09694: done checking for max_fail_percentage 30564 1726882826.09695: checking to see if all hosts have failed and the running result is not ok 30564 1726882826.09696: done checking to see if all hosts have failed 30564 1726882826.09696: getting the remaining hosts for this loop 30564 1726882826.09697: done getting the remaining hosts for this loop 30564 1726882826.09700: getting the next task for host managed_node2 30564 1726882826.09704: done getting next task for host managed_node2 30564 1726882826.09707: ^ task is: TASK: Show result 30564 1726882826.09709: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882826.09712: getting variables 30564 1726882826.09713: in VariableManager get_vars() 30564 1726882826.09722: Calling all_inventory to load vars for managed_node2 30564 1726882826.09725: Calling groups_inventory to load vars for managed_node2 30564 1726882826.09727: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882826.09732: Calling all_plugins_play to load vars for managed_node2 30564 1726882826.09734: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882826.09737: Calling groups_plugins_play to load vars for managed_node2 30564 1726882826.10990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882826.12755: done with get_vars() 30564 1726882826.12780: done getting variables 30564 1726882826.12822: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile_no_autoconnect.yml:15 Friday 20 September 2024 21:40:26 -0400 (0:00:00.491) 0:00:24.709 ****** 30564 1726882826.12852: entering _queue_task() for managed_node2/debug 30564 1726882826.13185: worker is 1 (out of 1 available) 30564 1726882826.13198: exiting _queue_task() for managed_node2/debug 30564 1726882826.13212: done queuing things up, now waiting for results queue to drain 30564 1726882826.13213: waiting for pending results... 
30564 1726882826.13510: running TaskExecutor() for managed_node2/TASK: Show result 30564 1726882826.13618: in run() - task 0e448fcc-3ce9-4216-acec-0000000006b2 30564 1726882826.13639: variable 'ansible_search_path' from source: unknown 30564 1726882826.13646: variable 'ansible_search_path' from source: unknown 30564 1726882826.13695: calling self._execute() 30564 1726882826.13799: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.13810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.13826: variable 'omit' from source: magic vars 30564 1726882826.14215: variable 'ansible_distribution_major_version' from source: facts 30564 1726882826.14235: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882826.14245: variable 'omit' from source: magic vars 30564 1726882826.14301: variable 'omit' from source: magic vars 30564 1726882826.14341: variable 'omit' from source: magic vars 30564 1726882826.14390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882826.14434: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882826.14458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882826.14486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882826.14502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882826.14539: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882826.14548: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.14556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.14785: Set 
connection var ansible_timeout to 10 30564 1726882826.14795: Set connection var ansible_pipelining to False 30564 1726882826.14857: Set connection var ansible_shell_type to sh 30564 1726882826.14872: Set connection var ansible_shell_executable to /bin/sh 30564 1726882826.14884: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882826.14891: Set connection var ansible_connection to ssh 30564 1726882826.14917: variable 'ansible_shell_executable' from source: unknown 30564 1726882826.14967: variable 'ansible_connection' from source: unknown 30564 1726882826.14978: variable 'ansible_module_compression' from source: unknown 30564 1726882826.14984: variable 'ansible_shell_type' from source: unknown 30564 1726882826.14991: variable 'ansible_shell_executable' from source: unknown 30564 1726882826.14996: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.15003: variable 'ansible_pipelining' from source: unknown 30564 1726882826.15009: variable 'ansible_timeout' from source: unknown 30564 1726882826.15016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.15271: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882826.15412: variable 'omit' from source: magic vars 30564 1726882826.15422: starting attempt loop 30564 1726882826.15428: running the handler 30564 1726882826.15483: variable '__network_connections_result' from source: set_fact 30564 1726882826.15569: variable '__network_connections_result' from source: set_fact 30564 1726882826.15944: handler run complete 30564 1726882826.15982: attempt loop complete, returning result 30564 1726882826.15990: _execute() done 30564 1726882826.15997: dumping result to json 30564 
1726882826.16006: done dumping result, returning 30564 1726882826.16017: done running TaskExecutor() for managed_node2/TASK: Show result [0e448fcc-3ce9-4216-acec-0000000006b2] 30564 1726882826.16027: sending task result for task 0e448fcc-3ce9-4216-acec-0000000006b2 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 25972e22-5267-43e5-84f8-5cddc8875a78\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 25972e22-5267-43e5-84f8-5cddc8875a78" ] } } 30564 1726882826.16232: no more pending results, returning what we have 30564 1726882826.16236: results queue empty 30564 1726882826.16237: checking for any_errors_fatal 30564 1726882826.16239: done checking for any_errors_fatal 30564 1726882826.16239: checking for max_fail_percentage 30564 1726882826.16241: done checking for max_fail_percentage 30564 1726882826.16242: checking to see if all hosts have failed and the running result is not ok 30564 1726882826.16243: done checking to see if all hosts have failed 30564 1726882826.16244: getting the remaining hosts for this loop 30564 1726882826.16246: done getting the remaining hosts for this loop 30564 1726882826.16250: getting the next task for host managed_node2 30564 1726882826.16260: done getting next task for host managed_node2 30564 1726882826.16270: ^ task is: TASK: Asserts 30564 1726882826.16274: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882826.16279: getting variables 30564 1726882826.16281: in VariableManager get_vars() 30564 1726882826.16311: Calling all_inventory to load vars for managed_node2 30564 1726882826.16313: Calling groups_inventory to load vars for managed_node2 30564 1726882826.16317: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882826.16329: Calling all_plugins_play to load vars for managed_node2 30564 1726882826.16333: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882826.16336: Calling groups_plugins_play to load vars for managed_node2 30564 1726882826.17538: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000006b2 30564 1726882826.17542: WORKER PROCESS EXITING 30564 1726882826.18929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882826.20941: done with get_vars() 30564 1726882826.20972: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:40:26 -0400 (0:00:00.084) 0:00:24.793 ****** 30564 1726882826.21262: entering _queue_task() for managed_node2/include_tasks 30564 1726882826.21592: worker is 1 (out of 1 available) 30564 1726882826.21604: exiting _queue_task() for managed_node2/include_tasks 30564 1726882826.21618: done queuing things up, now waiting for results queue to drain 30564 1726882826.21619: waiting for 
pending results... 30564 1726882826.21950: running TaskExecutor() for managed_node2/TASK: Asserts 30564 1726882826.22070: in run() - task 0e448fcc-3ce9-4216-acec-0000000005b9 30564 1726882826.22098: variable 'ansible_search_path' from source: unknown 30564 1726882826.22106: variable 'ansible_search_path' from source: unknown 30564 1726882826.22153: variable 'lsr_assert' from source: include params 30564 1726882826.22569: variable 'lsr_assert' from source: include params 30564 1726882826.22639: variable 'omit' from source: magic vars 30564 1726882826.22796: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.22810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.22823: variable 'omit' from source: magic vars 30564 1726882826.23155: variable 'ansible_distribution_major_version' from source: facts 30564 1726882826.23312: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882826.23323: variable 'item' from source: unknown 30564 1726882826.23393: variable 'item' from source: unknown 30564 1726882826.23547: variable 'item' from source: unknown 30564 1726882826.23612: variable 'item' from source: unknown 30564 1726882826.23924: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.23936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.23948: variable 'omit' from source: magic vars 30564 1726882826.24317: variable 'ansible_distribution_major_version' from source: facts 30564 1726882826.24327: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882826.24336: variable 'item' from source: unknown 30564 1726882826.24403: variable 'item' from source: unknown 30564 1726882826.24438: variable 'item' from source: unknown 30564 1726882826.24636: variable 'item' from source: unknown 30564 1726882826.24722: dumping result to json 30564 1726882826.24734: done dumping 
result, returning 30564 1726882826.24746: done running TaskExecutor() for managed_node2/TASK: Asserts [0e448fcc-3ce9-4216-acec-0000000005b9] 30564 1726882826.24755: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b9 30564 1726882826.24874: no more pending results, returning what we have 30564 1726882826.24880: in VariableManager get_vars() 30564 1726882826.24916: Calling all_inventory to load vars for managed_node2 30564 1726882826.24920: Calling groups_inventory to load vars for managed_node2 30564 1726882826.24924: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882826.24939: Calling all_plugins_play to load vars for managed_node2 30564 1726882826.24943: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882826.24946: Calling groups_plugins_play to load vars for managed_node2 30564 1726882826.25986: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005b9 30564 1726882826.25989: WORKER PROCESS EXITING 30564 1726882826.26859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882826.31931: done with get_vars() 30564 1726882826.32067: variable 'ansible_search_path' from source: unknown 30564 1726882826.32069: variable 'ansible_search_path' from source: unknown 30564 1726882826.32113: variable 'ansible_search_path' from source: unknown 30564 1726882826.32114: variable 'ansible_search_path' from source: unknown 30564 1726882826.32145: we have included files to process 30564 1726882826.32146: generating all_blocks data 30564 1726882826.32148: done generating all_blocks data 30564 1726882826.32152: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882826.32154: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 
1726882826.32156: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882826.32378: in VariableManager get_vars() 30564 1726882826.32400: done with get_vars() 30564 1726882826.32631: done processing included file 30564 1726882826.32633: iterating over new_blocks loaded from include file 30564 1726882826.32634: in VariableManager get_vars() 30564 1726882826.32648: done with get_vars() 30564 1726882826.32649: filtering new block on tags 30564 1726882826.32684: done filtering new block on tags 30564 1726882826.32687: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item=tasks/assert_device_absent.yml) 30564 1726882826.32691: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30564 1726882826.32692: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30564 1726882826.32695: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30564 1726882826.32905: in VariableManager get_vars() 30564 1726882826.32921: done with get_vars() 30564 1726882826.33520: done processing included file 30564 1726882826.33522: iterating over new_blocks loaded from include file 30564 1726882826.33523: in VariableManager get_vars() 30564 1726882826.33535: done with get_vars() 30564 1726882826.33536: filtering new block on tags 30564 1726882826.33650: done filtering new block on tags 30564 1726882826.33652: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=tasks/assert_profile_present.yml) 30564 1726882826.33656: extending task lists for all hosts with included blocks 30564 1726882826.36243: done extending task lists 30564 1726882826.36245: done processing included files 30564 1726882826.36245: results queue empty 30564 1726882826.36246: checking for any_errors_fatal 30564 1726882826.36252: done checking for any_errors_fatal 30564 1726882826.36253: checking for max_fail_percentage 30564 1726882826.36254: done checking for max_fail_percentage 30564 1726882826.36255: checking to see if all hosts have failed and the running result is not ok 30564 1726882826.36255: done checking to see if all hosts have failed 30564 1726882826.36256: getting the remaining hosts for this loop 30564 1726882826.36257: done getting the remaining hosts for this loop 30564 1726882826.36260: getting the next task for host managed_node2 30564 1726882826.36267: done getting next task for host managed_node2 30564 1726882826.36269: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882826.36272: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882826.36281: getting variables 30564 1726882826.36283: in VariableManager get_vars() 30564 1726882826.36406: Calling all_inventory to load vars for managed_node2 30564 1726882826.36409: Calling groups_inventory to load vars for managed_node2 30564 1726882826.36412: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882826.36417: Calling all_plugins_play to load vars for managed_node2 30564 1726882826.36420: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882826.36423: Calling groups_plugins_play to load vars for managed_node2 30564 1726882826.37965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882826.39848: done with get_vars() 30564 1726882826.39872: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:40:26 -0400 (0:00:00.186) 0:00:24.980 ****** 30564 1726882826.39955: entering _queue_task() for managed_node2/include_tasks 30564 1726882826.40307: worker is 1 (out of 1 available) 30564 1726882826.40321: exiting _queue_task() for managed_node2/include_tasks 30564 1726882826.40334: done queuing things up, now waiting for results queue to drain 30564 1726882826.40335: waiting for pending results... 
30564 1726882826.40636: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882826.40760: in run() - task 0e448fcc-3ce9-4216-acec-0000000008a8 30564 1726882826.40790: variable 'ansible_search_path' from source: unknown 30564 1726882826.40801: variable 'ansible_search_path' from source: unknown 30564 1726882826.40843: calling self._execute() 30564 1726882826.40946: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.40959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.40978: variable 'omit' from source: magic vars 30564 1726882826.41573: variable 'ansible_distribution_major_version' from source: facts 30564 1726882826.41592: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882826.41602: _execute() done 30564 1726882826.41609: dumping result to json 30564 1726882826.41616: done dumping result, returning 30564 1726882826.41625: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-0000000008a8] 30564 1726882826.41637: sending task result for task 0e448fcc-3ce9-4216-acec-0000000008a8 30564 1726882826.41782: no more pending results, returning what we have 30564 1726882826.41789: in VariableManager get_vars() 30564 1726882826.41831: Calling all_inventory to load vars for managed_node2 30564 1726882826.41834: Calling groups_inventory to load vars for managed_node2 30564 1726882826.41838: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882826.41852: Calling all_plugins_play to load vars for managed_node2 30564 1726882826.41855: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882826.41858: Calling groups_plugins_play to load vars for managed_node2 30564 1726882826.42959: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000008a8 30564 1726882826.42962: WORKER PROCESS EXITING 30564 
1726882826.43866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882826.46214: done with get_vars() 30564 1726882826.46239: variable 'ansible_search_path' from source: unknown 30564 1726882826.46240: variable 'ansible_search_path' from source: unknown 30564 1726882826.46249: variable 'item' from source: include params 30564 1726882826.46360: variable 'item' from source: include params 30564 1726882826.46395: we have included files to process 30564 1726882826.46396: generating all_blocks data 30564 1726882826.46398: done generating all_blocks data 30564 1726882826.46399: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882826.46400: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882826.46402: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882826.46584: done processing included file 30564 1726882826.46586: iterating over new_blocks loaded from include file 30564 1726882826.46588: in VariableManager get_vars() 30564 1726882826.46602: done with get_vars() 30564 1726882826.46604: filtering new block on tags 30564 1726882826.46628: done filtering new block on tags 30564 1726882826.46630: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882826.46635: extending task lists for all hosts with included blocks 30564 1726882826.46794: done extending task lists 30564 1726882826.46796: done processing included files 30564 1726882826.46796: results queue empty 30564 1726882826.46797: checking for any_errors_fatal 30564 1726882826.46800: done 
checking for any_errors_fatal 30564 1726882826.46801: checking for max_fail_percentage 30564 1726882826.46802: done checking for max_fail_percentage 30564 1726882826.46803: checking to see if all hosts have failed and the running result is not ok 30564 1726882826.46803: done checking to see if all hosts have failed 30564 1726882826.46804: getting the remaining hosts for this loop 30564 1726882826.46805: done getting the remaining hosts for this loop 30564 1726882826.46808: getting the next task for host managed_node2 30564 1726882826.46812: done getting next task for host managed_node2 30564 1726882826.46814: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882826.46818: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882826.46820: getting variables 30564 1726882826.46821: in VariableManager get_vars() 30564 1726882826.46829: Calling all_inventory to load vars for managed_node2 30564 1726882826.46831: Calling groups_inventory to load vars for managed_node2 30564 1726882826.46833: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882826.46838: Calling all_plugins_play to load vars for managed_node2 30564 1726882826.46840: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882826.46843: Calling groups_plugins_play to load vars for managed_node2 30564 1726882826.49122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882826.59134: done with get_vars() 30564 1726882826.59161: done getting variables 30564 1726882826.59299: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:40:26 -0400 (0:00:00.193) 0:00:25.174 ****** 30564 1726882826.59327: entering _queue_task() for managed_node2/stat 30564 1726882826.60098: worker is 1 (out of 1 available) 30564 1726882826.60224: exiting _queue_task() for managed_node2/stat 30564 1726882826.60236: done queuing things up, now waiting for results queue to drain 30564 1726882826.60238: waiting for pending results... 
30564 1726882826.61032: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882826.61162: in run() - task 0e448fcc-3ce9-4216-acec-000000000928 30564 1726882826.61178: variable 'ansible_search_path' from source: unknown 30564 1726882826.61183: variable 'ansible_search_path' from source: unknown 30564 1726882826.61219: calling self._execute() 30564 1726882826.61337: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.61342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.61346: variable 'omit' from source: magic vars 30564 1726882826.61839: variable 'ansible_distribution_major_version' from source: facts 30564 1726882826.61843: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882826.61846: variable 'omit' from source: magic vars 30564 1726882826.61940: variable 'omit' from source: magic vars 30564 1726882826.62032: variable 'interface' from source: play vars 30564 1726882826.62048: variable 'omit' from source: magic vars 30564 1726882826.62093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882826.62127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882826.62147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882826.62172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882826.62180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882826.62210: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882826.62214: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.62217: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.62315: Set connection var ansible_timeout to 10 30564 1726882826.62321: Set connection var ansible_pipelining to False 30564 1726882826.62323: Set connection var ansible_shell_type to sh 30564 1726882826.62330: Set connection var ansible_shell_executable to /bin/sh 30564 1726882826.62338: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882826.62341: Set connection var ansible_connection to ssh 30564 1726882826.62367: variable 'ansible_shell_executable' from source: unknown 30564 1726882826.62473: variable 'ansible_connection' from source: unknown 30564 1726882826.62476: variable 'ansible_module_compression' from source: unknown 30564 1726882826.62479: variable 'ansible_shell_type' from source: unknown 30564 1726882826.62482: variable 'ansible_shell_executable' from source: unknown 30564 1726882826.62486: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882826.62488: variable 'ansible_pipelining' from source: unknown 30564 1726882826.62490: variable 'ansible_timeout' from source: unknown 30564 1726882826.62497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882826.62691: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882826.62701: variable 'omit' from source: magic vars 30564 1726882826.62705: starting attempt loop 30564 1726882826.62709: running the handler 30564 1726882826.62722: _low_level_execute_command(): starting 30564 1726882826.62729: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882826.63617: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882826.63698: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882826.63707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.63722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.63760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882826.63773: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882826.63782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.63797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882826.63806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882826.63812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882826.63819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.63828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.63840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.63847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882826.63854: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882826.63864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.63937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882826.63955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882826.63973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882826.64107: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882826.65797: stdout chunk (state=3): >>>/root <<< 30564 1726882826.65966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882826.65972: stdout chunk (state=3): >>><<< 30564 1726882826.65980: stderr chunk (state=3): >>><<< 30564 1726882826.66002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882826.66012: _low_level_execute_command(): starting 30564 1726882826.66018: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997 `" && echo ansible-tmp-1726882826.659982-31686-130738518591997="` echo /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997 `" ) && sleep 0' 30564 
1726882826.67315: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882826.68180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.68191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.68205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.68241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882826.68248: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882826.68258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.68273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882826.68281: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882826.68287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882826.68295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.68304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.68315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.68323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882826.68329: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882826.68338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.68410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882826.68427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882826.68440: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882826.68577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882826.70560: stdout chunk (state=3): >>>ansible-tmp-1726882826.659982-31686-130738518591997=/root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997 <<< 30564 1726882826.70678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882826.70750: stderr chunk (state=3): >>><<< 30564 1726882826.70753: stdout chunk (state=3): >>><<< 30564 1726882826.70778: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882826.659982-31686-130738518591997=/root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882826.70825: variable 'ansible_module_compression' from source: unknown 
30564 1726882826.70888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882826.70922: variable 'ansible_facts' from source: unknown 30564 1726882826.71004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997/AnsiballZ_stat.py 30564 1726882826.71818: Sending initial data 30564 1726882826.71822: Sent initial data (152 bytes) 30564 1726882826.73873: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.73878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.73921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.73924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882826.73961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.73965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.73982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882826.73985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.74063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882826.74199: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882826.74304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882826.76149: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882826.76247: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882826.76349: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp46h_s1hh /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997/AnsiballZ_stat.py <<< 30564 1726882826.76442: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882826.77974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882826.78142: stderr chunk (state=3): >>><<< 30564 1726882826.78146: stdout chunk (state=3): >>><<< 30564 1726882826.78148: done transferring module to remote 30564 1726882826.78151: _low_level_execute_command(): starting 30564 1726882826.78153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997/ /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997/AnsiballZ_stat.py && sleep 0' 30564 1726882826.79778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 
1726882826.79796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.79930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.79950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.79994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882826.80007: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882826.80027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.80046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882826.80058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882826.80074: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882826.80088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.80103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.80120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.80139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882826.80152: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882826.80170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.80251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882826.80376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882826.80390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882826.80596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882826.82539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882826.82542: stdout chunk (state=3): >>><<< 30564 1726882826.82544: stderr chunk (state=3): >>><<< 30564 1726882826.82635: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882826.82639: _low_level_execute_command(): starting 30564 1726882826.82641: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997/AnsiballZ_stat.py && sleep 0' 30564 1726882826.83711: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882826.83714: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882826.83754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882826.83758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882826.83761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882826.83815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882826.84783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882826.84913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882826.98437: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882826.99495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882826.99572: stderr chunk (state=3): >>><<< 30564 1726882826.99585: stdout chunk (state=3): >>><<< 30564 1726882826.99697: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
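The stat module's stdout above is a single JSON document reporting that `/sys/class/net/statebr` does not exist. A minimal sketch of consuming such a payload, using the exact string captured in the log (plain `json` parsing here is an illustration, not Ansible's actual result-handling path):

```python
import json

# Raw module output captured from the log above (stdout of AnsiballZ_stat.py).
raw = ('{"changed": false, "stat": {"exists": false}, "invocation": '
       '{"module_args": {"get_attributes": false, "get_checksum": false, '
       '"get_mime": false, "path": "/sys/class/net/statebr", '
       '"follow": false, "checksum_algorithm": "sha1"}}}')

result = json.loads(raw)

# The interface is absent when stat.exists is False.
interface_absent = not result["stat"]["exists"]
print(result["invocation"]["module_args"]["path"], interface_absent)
```

The later "Assert that the interface is absent" task keys off exactly this `stat.exists` field.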
30564 1726882826.99702: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882826.99705: _low_level_execute_command(): starting 30564 1726882826.99712: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882826.659982-31686-130738518591997/ > /dev/null 2>&1 && sleep 0' 30564 1726882827.01150: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882827.01183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.01199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.01388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.01430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.01442: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882827.01456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.01480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882827.01494: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882827.01506: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882827.01518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.01533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.01551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.01566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.01581: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882827.01596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.01672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882827.01699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882827.01716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882827.01887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882827.03816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882827.03819: stdout chunk (state=3): >>><<< 30564 1726882827.03821: stderr chunk (state=3): >>><<< 30564 1726882827.03872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882827.03875: handler run complete 30564 1726882827.03879: attempt loop complete, returning result 30564 1726882827.04071: _execute() done 30564 1726882827.04074: dumping result to json 30564 1726882827.04077: done dumping result, returning 30564 1726882827.04079: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-000000000928] 30564 1726882827.04081: sending task result for task 0e448fcc-3ce9-4216-acec-000000000928 30564 1726882827.04157: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000928 30564 1726882827.04160: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882827.04221: no more pending results, returning what we have 30564 1726882827.04224: results queue empty 30564 1726882827.04225: checking for any_errors_fatal 30564 1726882827.04227: done checking for any_errors_fatal 30564 1726882827.04227: checking for max_fail_percentage 30564 1726882827.04229: done checking for max_fail_percentage 30564 1726882827.04230: checking to see if all hosts have failed and the running result is not ok 30564 1726882827.04231: done checking to see if all hosts have failed 30564 1726882827.04231: getting the remaining hosts for this loop 30564 
1726882827.04233: done getting the remaining hosts for this loop 30564 1726882827.04237: getting the next task for host managed_node2 30564 1726882827.04245: done getting next task for host managed_node2 30564 1726882827.04247: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30564 1726882827.04252: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882827.04256: getting variables 30564 1726882827.04257: in VariableManager get_vars() 30564 1726882827.04288: Calling all_inventory to load vars for managed_node2 30564 1726882827.04291: Calling groups_inventory to load vars for managed_node2 30564 1726882827.04294: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882827.04304: Calling all_plugins_play to load vars for managed_node2 30564 1726882827.04306: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882827.04309: Calling groups_plugins_play to load vars for managed_node2 30564 1726882827.07799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882827.13128: done with get_vars() 30564 1726882827.13156: done getting variables 30564 1726882827.13440: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882827.13793: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:40:27 -0400 (0:00:00.544) 0:00:25.719 ****** 30564 1726882827.14049: entering _queue_task() for managed_node2/assert 30564 1726882827.15091: worker is 1 (out of 1 available) 30564 1726882827.15104: exiting _queue_task() for managed_node2/assert 30564 1726882827.15117: done queuing things up, now waiting for results queue to drain 30564 1726882827.15118: waiting for pending results... 
30564 1726882827.15378: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30564 1726882827.15513: in run() - task 0e448fcc-3ce9-4216-acec-0000000008a9 30564 1726882827.15534: variable 'ansible_search_path' from source: unknown 30564 1726882827.15542: variable 'ansible_search_path' from source: unknown 30564 1726882827.15590: calling self._execute() 30564 1726882827.15695: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.15709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.15723: variable 'omit' from source: magic vars 30564 1726882827.16103: variable 'ansible_distribution_major_version' from source: facts 30564 1726882827.16127: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882827.16139: variable 'omit' from source: magic vars 30564 1726882827.16192: variable 'omit' from source: magic vars 30564 1726882827.16297: variable 'interface' from source: play vars 30564 1726882827.16321: variable 'omit' from source: magic vars 30564 1726882827.16373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882827.16412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882827.16442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882827.16469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882827.16487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882827.16521: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882827.16530: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.16538: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.16646: Set connection var ansible_timeout to 10 30564 1726882827.16662: Set connection var ansible_pipelining to False 30564 1726882827.16677: Set connection var ansible_shell_type to sh 30564 1726882827.16691: Set connection var ansible_shell_executable to /bin/sh 30564 1726882827.16703: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882827.16710: Set connection var ansible_connection to ssh 30564 1726882827.16739: variable 'ansible_shell_executable' from source: unknown 30564 1726882827.16749: variable 'ansible_connection' from source: unknown 30564 1726882827.16756: variable 'ansible_module_compression' from source: unknown 30564 1726882827.16772: variable 'ansible_shell_type' from source: unknown 30564 1726882827.16781: variable 'ansible_shell_executable' from source: unknown 30564 1726882827.16790: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.16798: variable 'ansible_pipelining' from source: unknown 30564 1726882827.16804: variable 'ansible_timeout' from source: unknown 30564 1726882827.16813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.17215: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882827.17232: variable 'omit' from source: magic vars 30564 1726882827.17243: starting attempt loop 30564 1726882827.17255: running the handler 30564 1726882827.17410: variable 'interface_stat' from source: set_fact 30564 1726882827.17424: Evaluated conditional (not interface_stat.stat.exists): True 30564 1726882827.17433: handler run complete 30564 1726882827.17451: attempt loop complete, returning result 
30564 1726882827.17458: _execute() done 30564 1726882827.17470: dumping result to json 30564 1726882827.17482: done dumping result, returning 30564 1726882827.17493: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0e448fcc-3ce9-4216-acec-0000000008a9] 30564 1726882827.17503: sending task result for task 0e448fcc-3ce9-4216-acec-0000000008a9 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882827.17651: no more pending results, returning what we have 30564 1726882827.17655: results queue empty 30564 1726882827.17656: checking for any_errors_fatal 30564 1726882827.17670: done checking for any_errors_fatal 30564 1726882827.17671: checking for max_fail_percentage 30564 1726882827.17676: done checking for max_fail_percentage 30564 1726882827.17677: checking to see if all hosts have failed and the running result is not ok 30564 1726882827.17677: done checking to see if all hosts have failed 30564 1726882827.17679: getting the remaining hosts for this loop 30564 1726882827.17680: done getting the remaining hosts for this loop 30564 1726882827.17685: getting the next task for host managed_node2 30564 1726882827.17695: done getting next task for host managed_node2 30564 1726882827.17698: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30564 1726882827.17701: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882827.17706: getting variables 30564 1726882827.17708: in VariableManager get_vars() 30564 1726882827.17739: Calling all_inventory to load vars for managed_node2 30564 1726882827.17742: Calling groups_inventory to load vars for managed_node2 30564 1726882827.17746: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882827.17758: Calling all_plugins_play to load vars for managed_node2 30564 1726882827.17761: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882827.17766: Calling groups_plugins_play to load vars for managed_node2 30564 1726882827.18284: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000008a9 30564 1726882827.18287: WORKER PROCESS EXITING 30564 1726882827.20384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882827.24130: done with get_vars() 30564 1726882827.24157: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:40:27 -0400 (0:00:00.105) 0:00:25.824 ****** 30564 1726882827.24369: entering _queue_task() for managed_node2/include_tasks 30564 1726882827.25152: worker is 1 (out of 1 available) 30564 1726882827.25170: exiting _queue_task() for managed_node2/include_tasks 30564 1726882827.25184: done queuing things up, now waiting for results queue to drain 30564 1726882827.25185: waiting for pending results... 
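The assert action above evaluated the conditional `not interface_stat.stat.exists` against the fact registered by the earlier stat task and reported "All assertions passed". A rough Python sketch of that pass/fail shape, with a hypothetical dict standing in for the registered variable (Ansible actually evaluates the condition through Jinja2 templating, which is not reproduced here):

```python
# Hypothetical stand-in for the registered result of the "Get stat for interface" task.
interface_stat = {"changed": False, "stat": {"exists": False}}

def assert_that(conditions):
    """Mimic the assert action's result shape: every named condition must hold."""
    failed = [name for name, ok in conditions if not ok]
    if failed:
        return {"failed": True, "msg": "Assertion failed", "failed_conditions": failed}
    return {"changed": False, "msg": "All assertions passed"}

outcome = assert_that([
    ("not interface_stat.stat.exists", not interface_stat["stat"]["exists"]),
])
print(outcome["msg"])
```

Note that assert runs entirely on the controller: unlike the stat task, no module is transferred to the remote host, which is why the log shows no SSH traffic for it.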
30564 1726882827.26161: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30564 1726882827.26266: in run() - task 0e448fcc-3ce9-4216-acec-0000000008ad 30564 1726882827.26279: variable 'ansible_search_path' from source: unknown 30564 1726882827.26283: variable 'ansible_search_path' from source: unknown 30564 1726882827.26340: calling self._execute() 30564 1726882827.27325: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.27331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.27343: variable 'omit' from source: magic vars 30564 1726882827.28118: variable 'ansible_distribution_major_version' from source: facts 30564 1726882827.28131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882827.28137: _execute() done 30564 1726882827.28140: dumping result to json 30564 1726882827.28143: done dumping result, returning 30564 1726882827.28150: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4216-acec-0000000008ad] 30564 1726882827.28156: sending task result for task 0e448fcc-3ce9-4216-acec-0000000008ad 30564 1726882827.28376: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000008ad 30564 1726882827.28379: WORKER PROCESS EXITING 30564 1726882827.28408: no more pending results, returning what we have 30564 1726882827.28414: in VariableManager get_vars() 30564 1726882827.28453: Calling all_inventory to load vars for managed_node2 30564 1726882827.28457: Calling groups_inventory to load vars for managed_node2 30564 1726882827.28461: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882827.28481: Calling all_plugins_play to load vars for managed_node2 30564 1726882827.28485: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882827.28490: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882827.32046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882827.37273: done with get_vars() 30564 1726882827.37414: variable 'ansible_search_path' from source: unknown 30564 1726882827.37415: variable 'ansible_search_path' from source: unknown 30564 1726882827.37426: variable 'item' from source: include params 30564 1726882827.37658: variable 'item' from source: include params 30564 1726882827.37701: we have included files to process 30564 1726882827.37702: generating all_blocks data 30564 1726882827.37704: done generating all_blocks data 30564 1726882827.37823: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882827.37825: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882827.37828: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882827.39808: done processing included file 30564 1726882827.39810: iterating over new_blocks loaded from include file 30564 1726882827.39811: in VariableManager get_vars() 30564 1726882827.39827: done with get_vars() 30564 1726882827.39829: filtering new block on tags 30564 1726882827.40020: done filtering new block on tags 30564 1726882827.40023: in VariableManager get_vars() 30564 1726882827.40037: done with get_vars() 30564 1726882827.40039: filtering new block on tags 30564 1726882827.40214: done filtering new block on tags 30564 1726882827.40217: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30564 1726882827.40223: extending task lists for all hosts with included blocks 30564 1726882827.41124: done 
extending task lists 30564 1726882827.41125: done processing included files 30564 1726882827.41126: results queue empty 30564 1726882827.41127: checking for any_errors_fatal 30564 1726882827.41131: done checking for any_errors_fatal 30564 1726882827.41131: checking for max_fail_percentage 30564 1726882827.41132: done checking for max_fail_percentage 30564 1726882827.41133: checking to see if all hosts have failed and the running result is not ok 30564 1726882827.41134: done checking to see if all hosts have failed 30564 1726882827.41135: getting the remaining hosts for this loop 30564 1726882827.41136: done getting the remaining hosts for this loop 30564 1726882827.41138: getting the next task for host managed_node2 30564 1726882827.41143: done getting next task for host managed_node2 30564 1726882827.41145: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882827.41149: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882827.41151: getting variables 30564 1726882827.41152: in VariableManager get_vars() 30564 1726882827.41160: Calling all_inventory to load vars for managed_node2 30564 1726882827.41162: Calling groups_inventory to load vars for managed_node2 30564 1726882827.41169: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882827.41287: Calling all_plugins_play to load vars for managed_node2 30564 1726882827.41291: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882827.41295: Calling groups_plugins_play to load vars for managed_node2 30564 1726882827.43913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882827.47660: done with get_vars() 30564 1726882827.47690: done getting variables 30564 1726882827.47732: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:40:27 -0400 (0:00:00.235) 0:00:26.060 ****** 30564 1726882827.47884: entering _queue_task() for managed_node2/set_fact 30564 1726882827.48552: worker is 1 (out of 1 available) 30564 1726882827.48566: exiting _queue_task() for managed_node2/set_fact 30564 1726882827.48581: done queuing things up, now waiting for results queue to drain 30564 1726882827.48582: waiting for pending results... 
30564 1726882827.49455: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882827.49725: in run() - task 0e448fcc-3ce9-4216-acec-000000000946 30564 1726882827.49745: variable 'ansible_search_path' from source: unknown 30564 1726882827.49832: variable 'ansible_search_path' from source: unknown 30564 1726882827.49876: calling self._execute() 30564 1726882827.50097: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.50109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.50122: variable 'omit' from source: magic vars 30564 1726882827.50957: variable 'ansible_distribution_major_version' from source: facts 30564 1726882827.50982: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882827.50994: variable 'omit' from source: magic vars 30564 1726882827.51059: variable 'omit' from source: magic vars 30564 1726882827.51177: variable 'omit' from source: magic vars 30564 1726882827.51221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882827.51393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882827.51417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882827.51440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882827.51577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882827.51612: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882827.51621: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.51627: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882827.51737: Set connection var ansible_timeout to 10 30564 1726882827.51909: Set connection var ansible_pipelining to False 30564 1726882827.51916: Set connection var ansible_shell_type to sh 30564 1726882827.51925: Set connection var ansible_shell_executable to /bin/sh 30564 1726882827.51936: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882827.51942: Set connection var ansible_connection to ssh 30564 1726882827.51974: variable 'ansible_shell_executable' from source: unknown 30564 1726882827.51982: variable 'ansible_connection' from source: unknown 30564 1726882827.51989: variable 'ansible_module_compression' from source: unknown 30564 1726882827.51996: variable 'ansible_shell_type' from source: unknown 30564 1726882827.52008: variable 'ansible_shell_executable' from source: unknown 30564 1726882827.52018: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.52026: variable 'ansible_pipelining' from source: unknown 30564 1726882827.52032: variable 'ansible_timeout' from source: unknown 30564 1726882827.52039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.52297: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882827.52346: variable 'omit' from source: magic vars 30564 1726882827.52451: starting attempt loop 30564 1726882827.52457: running the handler 30564 1726882827.52479: handler run complete 30564 1726882827.52492: attempt loop complete, returning result 30564 1726882827.52498: _execute() done 30564 1726882827.52504: dumping result to json 30564 1726882827.52510: done dumping result, returning 30564 1726882827.52520: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4216-acec-000000000946] 30564 1726882827.52529: sending task result for task 0e448fcc-3ce9-4216-acec-000000000946 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30564 1726882827.52716: no more pending results, returning what we have 30564 1726882827.52720: results queue empty 30564 1726882827.52722: checking for any_errors_fatal 30564 1726882827.52724: done checking for any_errors_fatal 30564 1726882827.52725: checking for max_fail_percentage 30564 1726882827.52727: done checking for max_fail_percentage 30564 1726882827.52728: checking to see if all hosts have failed and the running result is not ok 30564 1726882827.52728: done checking to see if all hosts have failed 30564 1726882827.52729: getting the remaining hosts for this loop 30564 1726882827.52731: done getting the remaining hosts for this loop 30564 1726882827.52735: getting the next task for host managed_node2 30564 1726882827.52744: done getting next task for host managed_node2 30564 1726882827.52747: ^ task is: TASK: Stat profile file 30564 1726882827.52753: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882827.52758: getting variables 30564 1726882827.52759: in VariableManager get_vars() 30564 1726882827.52797: Calling all_inventory to load vars for managed_node2 30564 1726882827.52800: Calling groups_inventory to load vars for managed_node2 30564 1726882827.52804: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882827.52815: Calling all_plugins_play to load vars for managed_node2 30564 1726882827.52819: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882827.52821: Calling groups_plugins_play to load vars for managed_node2 30564 1726882827.53814: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000946 30564 1726882827.53817: WORKER PROCESS EXITING 30564 1726882827.56147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882827.59904: done with get_vars() 30564 1726882827.59937: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:40:27 -0400 (0:00:00.122) 0:00:26.182 ****** 30564 1726882827.60160: entering _queue_task() for managed_node2/stat 30564 1726882827.60937: worker is 1 (out of 1 available) 30564 1726882827.60950: exiting _queue_task() for managed_node2/stat 30564 1726882827.60962: done queuing things up, now waiting for results queue to drain 30564 1726882827.60965: waiting for pending results... 
30564 1726882827.61878: running TaskExecutor() for managed_node2/TASK: Stat profile file 30564 1726882827.62116: in run() - task 0e448fcc-3ce9-4216-acec-000000000947 30564 1726882827.62253: variable 'ansible_search_path' from source: unknown 30564 1726882827.62260: variable 'ansible_search_path' from source: unknown 30564 1726882827.62304: calling self._execute() 30564 1726882827.62410: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.62582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.62597: variable 'omit' from source: magic vars 30564 1726882827.63210: variable 'ansible_distribution_major_version' from source: facts 30564 1726882827.63345: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882827.63446: variable 'omit' from source: magic vars 30564 1726882827.63511: variable 'omit' from source: magic vars 30564 1726882827.63707: variable 'profile' from source: play vars 30564 1726882827.63773: variable 'interface' from source: play vars 30564 1726882827.63997: variable 'interface' from source: play vars 30564 1726882827.64027: variable 'omit' from source: magic vars 30564 1726882827.64079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882827.64227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882827.64252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882827.64286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882827.64310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882827.64386: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 
1726882827.64432: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.64441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.64757: Set connection var ansible_timeout to 10 30564 1726882827.64773: Set connection var ansible_pipelining to False 30564 1726882827.64782: Set connection var ansible_shell_type to sh 30564 1726882827.64794: Set connection var ansible_shell_executable to /bin/sh 30564 1726882827.64806: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882827.64815: Set connection var ansible_connection to ssh 30564 1726882827.64847: variable 'ansible_shell_executable' from source: unknown 30564 1726882827.64872: variable 'ansible_connection' from source: unknown 30564 1726882827.64957: variable 'ansible_module_compression' from source: unknown 30564 1726882827.64974: variable 'ansible_shell_type' from source: unknown 30564 1726882827.64989: variable 'ansible_shell_executable' from source: unknown 30564 1726882827.64996: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882827.65004: variable 'ansible_pipelining' from source: unknown 30564 1726882827.65012: variable 'ansible_timeout' from source: unknown 30564 1726882827.65019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882827.65510: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882827.65535: variable 'omit' from source: magic vars 30564 1726882827.65545: starting attempt loop 30564 1726882827.65552: running the handler 30564 1726882827.65576: _low_level_execute_command(): starting 30564 1726882827.65589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882827.68373: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.68411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.68415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882827.68417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882827.68419: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.68422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.68712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882827.68716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882827.68718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882827.68840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882827.70525: stdout chunk (state=3): >>>/root <<< 30564 1726882827.70631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882827.70710: stderr chunk (state=3): >>><<< 30564 1726882827.70713: stdout chunk (state=3): >>><<< 30564 1726882827.70830: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882827.70834: _low_level_execute_command(): starting 30564 1726882827.70838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665 `" && echo ansible-tmp-1726882827.7073536-31718-175181161046665="` echo /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665 `" ) && sleep 0' 30564 1726882827.72611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882827.72624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.72638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.72670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 
1726882827.72717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.72730: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882827.72745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.72765: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882827.72781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882827.72799: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882827.72812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.72825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.72840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.72851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.72861: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882827.72881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.72976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882827.73130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882827.73145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882827.73353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882827.75341: stdout chunk (state=3): >>>ansible-tmp-1726882827.7073536-31718-175181161046665=/root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665 <<< 30564 1726882827.75530: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 30564 1726882827.75534: stdout chunk (state=3): >>><<< 30564 1726882827.75536: stderr chunk (state=3): >>><<< 30564 1726882827.75775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882827.7073536-31718-175181161046665=/root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882827.75778: variable 'ansible_module_compression' from source: unknown 30564 1726882827.75780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882827.75783: variable 'ansible_facts' from source: unknown 30564 1726882827.75798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665/AnsiballZ_stat.py 30564 1726882827.76486: Sending initial data 30564 
1726882827.76489: Sent initial data (153 bytes) 30564 1726882827.78790: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882827.78946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.78960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.78984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.79027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.79046: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882827.79062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.79087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882827.79100: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882827.79110: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882827.79121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.79133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.79148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.79165: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.79179: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882827.79191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.79276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882827.79390: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 30564 1726882827.79405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882827.79620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882827.81461: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882827.81555: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882827.81655: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpnlqrfd4o /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665/AnsiballZ_stat.py <<< 30564 1726882827.81752: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882827.83254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882827.83382: stderr chunk (state=3): >>><<< 30564 1726882827.83385: stdout chunk (state=3): >>><<< 30564 1726882827.83388: done transferring module to remote 30564 1726882827.83390: _low_level_execute_command(): starting 30564 1726882827.83393: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665/ /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665/AnsiballZ_stat.py && sleep 0' 30564 1726882827.84617: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882827.84630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.84644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.84673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.84729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.84774: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882827.84788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.84811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882827.84821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882827.84829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882827.84843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.84861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.84895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.84909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.84918: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882827.84929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.85027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882827.85044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882827.85062: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882827.85258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882827.87183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882827.87186: stdout chunk (state=3): >>><<< 30564 1726882827.87189: stderr chunk (state=3): >>><<< 30564 1726882827.87285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882827.87289: _low_level_execute_command(): starting 30564 1726882827.87291: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665/AnsiballZ_stat.py && sleep 0' 30564 1726882827.88960: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882827.88973: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.88978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.88992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.89089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.89165: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882827.89185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.89196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882827.89199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882827.89219: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882827.89223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882827.89225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882827.89237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882827.89244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882827.89251: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882827.89260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882827.89347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882827.89366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882827.89378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882827.89529: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.03048: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882828.04181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882828.04185: stdout chunk (state=3): >>><<< 30564 1726882828.04191: stderr chunk (state=3): >>><<< 30564 1726882828.04213: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882828.04243: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882828.04251: _low_level_execute_command(): starting 30564 1726882828.04256: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882827.7073536-31718-175181161046665/ > /dev/null 2>&1 && sleep 0' 30564 1726882828.05334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882828.05342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.05352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.05366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.05405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.05413: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882828.05426: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.05444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882828.05451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882828.05458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882828.05467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.05479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.05489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.05496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.05503: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882828.05512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.05590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882828.05606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882828.05617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882828.05746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.07715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882828.07723: stdout chunk (state=3): >>><<< 30564 1726882828.07726: stderr chunk (state=3): >>><<< 30564 1726882828.08076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882828.08079: handler run complete 30564 1726882828.08081: attempt loop complete, returning result 30564 1726882828.08083: _execute() done 30564 1726882828.08085: dumping result to json 30564 1726882828.08087: done dumping result, returning 30564 1726882828.08089: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4216-acec-000000000947] 30564 1726882828.08091: sending task result for task 0e448fcc-3ce9-4216-acec-000000000947 30564 1726882828.08161: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000947 30564 1726882828.08166: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882828.08253: no more pending results, returning what we have 30564 1726882828.08257: results queue empty 30564 1726882828.08258: checking for any_errors_fatal 30564 1726882828.08276: done checking for any_errors_fatal 30564 1726882828.08278: checking for max_fail_percentage 30564 1726882828.08280: done checking for max_fail_percentage 30564 1726882828.08281: checking to see if all hosts have failed and the 
running result is not ok 30564 1726882828.08285: done checking to see if all hosts have failed 30564 1726882828.08286: getting the remaining hosts for this loop 30564 1726882828.08287: done getting the remaining hosts for this loop 30564 1726882828.08291: getting the next task for host managed_node2 30564 1726882828.08304: done getting next task for host managed_node2 30564 1726882828.08309: ^ task is: TASK: Set NM profile exist flag based on the profile files 30564 1726882828.08314: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882828.08319: getting variables 30564 1726882828.08320: in VariableManager get_vars() 30564 1726882828.08351: Calling all_inventory to load vars for managed_node2 30564 1726882828.08357: Calling groups_inventory to load vars for managed_node2 30564 1726882828.08365: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882828.08384: Calling all_plugins_play to load vars for managed_node2 30564 1726882828.08391: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882828.08395: Calling groups_plugins_play to load vars for managed_node2 30564 1726882828.10379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882828.12270: done with get_vars() 30564 1726882828.12294: done getting variables 30564 1726882828.12352: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:40:28 -0400 (0:00:00.522) 0:00:26.705 ****** 30564 1726882828.12390: entering _queue_task() for managed_node2/set_fact 30564 1726882828.12703: worker is 1 (out of 1 available) 30564 1726882828.12716: exiting _queue_task() for managed_node2/set_fact 30564 1726882828.12728: done queuing things up, now waiting for results queue to drain 30564 1726882828.12729: waiting for pending results... 
30564 1726882828.13021: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30564 1726882828.13151: in run() - task 0e448fcc-3ce9-4216-acec-000000000948 30564 1726882828.13180: variable 'ansible_search_path' from source: unknown 30564 1726882828.13188: variable 'ansible_search_path' from source: unknown 30564 1726882828.13226: calling self._execute() 30564 1726882828.13331: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.13342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.13359: variable 'omit' from source: magic vars 30564 1726882828.13818: variable 'ansible_distribution_major_version' from source: facts 30564 1726882828.13850: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882828.14028: variable 'profile_stat' from source: set_fact 30564 1726882828.14047: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882828.14055: when evaluation is False, skipping this task 30564 1726882828.14062: _execute() done 30564 1726882828.14074: dumping result to json 30564 1726882828.14080: done dumping result, returning 30564 1726882828.14090: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4216-acec-000000000948] 30564 1726882828.14108: sending task result for task 0e448fcc-3ce9-4216-acec-000000000948 30564 1726882828.14259: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000948 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882828.14311: no more pending results, returning what we have 30564 1726882828.14316: results queue empty 30564 1726882828.14317: checking for any_errors_fatal 30564 1726882828.14327: done checking for any_errors_fatal 30564 1726882828.14327: checking for max_fail_percentage 30564 
1726882828.14329: done checking for max_fail_percentage 30564 1726882828.14330: checking to see if all hosts have failed and the running result is not ok 30564 1726882828.14331: done checking to see if all hosts have failed 30564 1726882828.14332: getting the remaining hosts for this loop 30564 1726882828.14334: done getting the remaining hosts for this loop 30564 1726882828.14337: getting the next task for host managed_node2 30564 1726882828.14346: done getting next task for host managed_node2 30564 1726882828.14349: ^ task is: TASK: Get NM profile info 30564 1726882828.14356: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882828.14360: getting variables 30564 1726882828.14362: in VariableManager get_vars() 30564 1726882828.14397: Calling all_inventory to load vars for managed_node2 30564 1726882828.14400: Calling groups_inventory to load vars for managed_node2 30564 1726882828.14406: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882828.14425: Calling all_plugins_play to load vars for managed_node2 30564 1726882828.14432: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882828.14439: Calling groups_plugins_play to load vars for managed_node2 30564 1726882828.15476: WORKER PROCESS EXITING 30564 1726882828.16542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882828.18399: done with get_vars() 30564 1726882828.18421: done getting variables 30564 1726882828.18497: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:40:28 -0400 (0:00:00.061) 0:00:26.766 ****** 30564 1726882828.18536: entering _queue_task() for managed_node2/shell 30564 1726882828.18873: worker is 1 (out of 1 available) 30564 1726882828.18885: exiting _queue_task() for managed_node2/shell 30564 1726882828.18898: done queuing things up, now waiting for results queue to drain 30564 1726882828.18900: waiting for pending results... 
30564 1726882828.19294: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30564 1726882828.19448: in run() - task 0e448fcc-3ce9-4216-acec-000000000949 30564 1726882828.19507: variable 'ansible_search_path' from source: unknown 30564 1726882828.19519: variable 'ansible_search_path' from source: unknown 30564 1726882828.19639: calling self._execute() 30564 1726882828.19861: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.19884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.19902: variable 'omit' from source: magic vars 30564 1726882828.20388: variable 'ansible_distribution_major_version' from source: facts 30564 1726882828.20408: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882828.20418: variable 'omit' from source: magic vars 30564 1726882828.20502: variable 'omit' from source: magic vars 30564 1726882828.20635: variable 'profile' from source: play vars 30564 1726882828.20646: variable 'interface' from source: play vars 30564 1726882828.20732: variable 'interface' from source: play vars 30564 1726882828.20760: variable 'omit' from source: magic vars 30564 1726882828.20823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882828.20877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882828.20915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882828.20939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882828.20955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882828.20998: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 
1726882828.21007: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.21018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.21154: Set connection var ansible_timeout to 10 30564 1726882828.21171: Set connection var ansible_pipelining to False 30564 1726882828.21178: Set connection var ansible_shell_type to sh 30564 1726882828.21188: Set connection var ansible_shell_executable to /bin/sh 30564 1726882828.21200: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882828.21207: Set connection var ansible_connection to ssh 30564 1726882828.21241: variable 'ansible_shell_executable' from source: unknown 30564 1726882828.21254: variable 'ansible_connection' from source: unknown 30564 1726882828.21265: variable 'ansible_module_compression' from source: unknown 30564 1726882828.21278: variable 'ansible_shell_type' from source: unknown 30564 1726882828.21286: variable 'ansible_shell_executable' from source: unknown 30564 1726882828.21293: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.21301: variable 'ansible_pipelining' from source: unknown 30564 1726882828.21308: variable 'ansible_timeout' from source: unknown 30564 1726882828.21316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.22132: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882828.22162: variable 'omit' from source: magic vars 30564 1726882828.22178: starting attempt loop 30564 1726882828.22190: running the handler 30564 1726882828.22220: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882828.22251: _low_level_execute_command(): starting 30564 1726882828.22279: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882828.23471: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882828.23577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882828.23707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.25305: stdout chunk (state=3): >>>/root <<< 30564 1726882828.25417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882828.25537: stderr chunk (state=3): >>><<< 30564 1726882828.25540: stdout chunk (state=3): >>><<< 30564 1726882828.25660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882828.25666: _low_level_execute_command(): starting 30564 1726882828.25673: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118 `" && echo ansible-tmp-1726882828.2556174-31741-162369640073118="` echo /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118 `" ) && sleep 0' 30564 1726882828.26725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882828.26738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.26761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.26809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 
1726882828.26853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.26909: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882828.26932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.26952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882828.26971: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882828.26988: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882828.27003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.27022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.27075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.27093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.27109: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882828.27126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.27214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882828.27233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882828.27251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882828.27437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.29382: stdout chunk (state=3): >>>ansible-tmp-1726882828.2556174-31741-162369640073118=/root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118 <<< 30564 1726882828.29532: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 30564 1726882828.29637: stderr chunk (state=3): >>><<< 30564 1726882828.29640: stdout chunk (state=3): >>><<< 30564 1726882828.29657: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882828.2556174-31741-162369640073118=/root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882828.29779: variable 'ansible_module_compression' from source: unknown 30564 1726882828.29991: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882828.30053: variable 'ansible_facts' from source: unknown 30564 1726882828.30125: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118/AnsiballZ_command.py 30564 1726882828.30310: Sending initial data 30564 
1726882828.30313: Sent initial data (156 bytes) 30564 1726882828.32372: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882828.32376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.32379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.32381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.32388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.32392: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882828.32571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.32575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882828.32577: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882828.32580: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882828.32582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.32584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.32586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.32687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.32691: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882828.32698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.32701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882828.32703: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 30564 1726882828.32705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882828.32872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.34677: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882828.34777: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882828.34886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpmn8s8a9y /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118/AnsiballZ_command.py <<< 30564 1726882828.35010: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882828.36392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882828.36462: stderr chunk (state=3): >>><<< 30564 1726882828.36468: stdout chunk (state=3): >>><<< 30564 1726882828.36489: done transferring module to remote 30564 1726882828.36499: _low_level_execute_command(): starting 30564 1726882828.36505: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118/ /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118/AnsiballZ_command.py && sleep 0' 30564 1726882828.37179: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882828.37188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.37199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.37213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.37260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.37272: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882828.37279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.37293: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882828.37317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882828.37326: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882828.37340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.37351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.37370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.37376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.37383: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882828.37393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.37687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882828.37704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882828.37716: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882828.37842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.39682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882828.39740: stderr chunk (state=3): >>><<< 30564 1726882828.39743: stdout chunk (state=3): >>><<< 30564 1726882828.39768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882828.39773: _low_level_execute_command(): starting 30564 1726882828.39779: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118/AnsiballZ_command.py && sleep 0' 30564 1726882828.40431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882828.40441: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.40454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.40470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.40510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.40527: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882828.40538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.40558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882828.40567: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882828.40577: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882828.40585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.40594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.40605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.40613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.40619: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882828.40634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.40712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882828.40728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882828.40745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882828.40877: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.56235: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:40:28.540821", "end": "2024-09-20 21:40:28.560195", "delta": "0:00:00.019374", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882828.57588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882828.57592: stdout chunk (state=3): >>><<< 30564 1726882828.57595: stderr chunk (state=3): >>><<< 30564 1726882828.57618: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:40:28.540821", "end": "2024-09-20 21:40:28.560195", "delta": "0:00:00.019374", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882828.57657: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882828.57662: _low_level_execute_command(): starting 30564 1726882828.57672: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882828.2556174-31741-162369640073118/ > /dev/null 2>&1 && 
sleep 0' 30564 1726882828.58275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882828.58286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.58296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.58311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.58361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.58373: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882828.58383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.58397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882828.58405: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882828.58412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882828.58471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882828.58478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882828.58491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882828.58499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882828.58506: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882828.58515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882828.58783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882828.58799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 
1726882828.58811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882828.58939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882828.60816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882828.60819: stdout chunk (state=3): >>><<< 30564 1726882828.60826: stderr chunk (state=3): >>><<< 30564 1726882828.60841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882828.60848: handler run complete 30564 1726882828.60872: Evaluated conditional (False): False 30564 1726882828.60883: attempt loop complete, returning result 30564 1726882828.60888: _execute() done 30564 1726882828.60890: dumping result to json 30564 1726882828.60892: done dumping result, returning 30564 1726882828.60901: done running 
TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4216-acec-000000000949] 30564 1726882828.60907: sending task result for task 0e448fcc-3ce9-4216-acec-000000000949 30564 1726882828.61014: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000949 30564 1726882828.61017: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.019374", "end": "2024-09-20 21:40:28.560195", "rc": 0, "start": "2024-09-20 21:40:28.540821" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30564 1726882828.61088: no more pending results, returning what we have 30564 1726882828.61092: results queue empty 30564 1726882828.61093: checking for any_errors_fatal 30564 1726882828.61100: done checking for any_errors_fatal 30564 1726882828.61101: checking for max_fail_percentage 30564 1726882828.61102: done checking for max_fail_percentage 30564 1726882828.61103: checking to see if all hosts have failed and the running result is not ok 30564 1726882828.61104: done checking to see if all hosts have failed 30564 1726882828.61105: getting the remaining hosts for this loop 30564 1726882828.61106: done getting the remaining hosts for this loop 30564 1726882828.61110: getting the next task for host managed_node2 30564 1726882828.61117: done getting next task for host managed_node2 30564 1726882828.61120: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882828.61127: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882828.61130: getting variables 30564 1726882828.61132: in VariableManager get_vars() 30564 1726882828.61161: Calling all_inventory to load vars for managed_node2 30564 1726882828.61171: Calling groups_inventory to load vars for managed_node2 30564 1726882828.61176: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882828.61190: Calling all_plugins_play to load vars for managed_node2 30564 1726882828.61193: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882828.61195: Calling groups_plugins_play to load vars for managed_node2 30564 1726882828.62873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882828.65302: done with get_vars() 30564 1726882828.65334: done getting variables 30564 1726882828.65408: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli 
output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:40:28 -0400 (0:00:00.469) 0:00:27.235 ****** 30564 1726882828.65444: entering _queue_task() for managed_node2/set_fact 30564 1726882828.65848: worker is 1 (out of 1 available) 30564 1726882828.65866: exiting _queue_task() for managed_node2/set_fact 30564 1726882828.65898: done queuing things up, now waiting for results queue to drain 30564 1726882828.65900: waiting for pending results... 30564 1726882828.66888: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882828.67123: in run() - task 0e448fcc-3ce9-4216-acec-00000000094a 30564 1726882828.67147: variable 'ansible_search_path' from source: unknown 30564 1726882828.67162: variable 'ansible_search_path' from source: unknown 30564 1726882828.67208: calling self._execute() 30564 1726882828.67315: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.67325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.67343: variable 'omit' from source: magic vars 30564 1726882828.67999: variable 'ansible_distribution_major_version' from source: facts 30564 1726882828.68022: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882828.68320: variable 'nm_profile_exists' from source: set_fact 30564 1726882828.68335: Evaluated conditional (nm_profile_exists.rc == 0): True 30564 1726882828.68350: variable 'omit' from source: magic vars 30564 1726882828.68535: variable 'omit' from source: magic vars 30564 1726882828.68580: variable 'omit' from source: magic vars 30564 1726882828.68673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882828.68712: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882828.68771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882828.68845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882828.68877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882828.68972: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882828.68982: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.69000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.69123: Set connection var ansible_timeout to 10 30564 1726882828.69135: Set connection var ansible_pipelining to False 30564 1726882828.69142: Set connection var ansible_shell_type to sh 30564 1726882828.69152: Set connection var ansible_shell_executable to /bin/sh 30564 1726882828.69193: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882828.69200: Set connection var ansible_connection to ssh 30564 1726882828.69238: variable 'ansible_shell_executable' from source: unknown 30564 1726882828.69252: variable 'ansible_connection' from source: unknown 30564 1726882828.69259: variable 'ansible_module_compression' from source: unknown 30564 1726882828.69271: variable 'ansible_shell_type' from source: unknown 30564 1726882828.69279: variable 'ansible_shell_executable' from source: unknown 30564 1726882828.69289: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.69298: variable 'ansible_pipelining' from source: unknown 30564 1726882828.69304: variable 'ansible_timeout' from source: unknown 30564 1726882828.69312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882828.69471: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882828.69489: variable 'omit' from source: magic vars 30564 1726882828.69499: starting attempt loop 30564 1726882828.69510: running the handler 30564 1726882828.69527: handler run complete 30564 1726882828.69547: attempt loop complete, returning result 30564 1726882828.69554: _execute() done 30564 1726882828.69560: dumping result to json 30564 1726882828.69573: done dumping result, returning 30564 1726882828.69586: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4216-acec-00000000094a] 30564 1726882828.69596: sending task result for task 0e448fcc-3ce9-4216-acec-00000000094a 30564 1726882828.69721: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000094a ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30564 1726882828.69780: no more pending results, returning what we have 30564 1726882828.69784: results queue empty 30564 1726882828.69785: checking for any_errors_fatal 30564 1726882828.69792: done checking for any_errors_fatal 30564 1726882828.69793: checking for max_fail_percentage 30564 1726882828.69795: done checking for max_fail_percentage 30564 1726882828.69796: checking to see if all hosts have failed and the running result is not ok 30564 1726882828.69796: done checking to see if all hosts have failed 30564 1726882828.69798: getting the remaining hosts for this loop 30564 1726882828.69799: done getting the remaining hosts for this loop 30564 1726882828.69803: getting the next task for host 
managed_node2 30564 1726882828.69814: done getting next task for host managed_node2 30564 1726882828.69819: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882828.69825: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882828.69829: getting variables 30564 1726882828.69831: in VariableManager get_vars() 30564 1726882828.69865: Calling all_inventory to load vars for managed_node2 30564 1726882828.69870: Calling groups_inventory to load vars for managed_node2 30564 1726882828.69875: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882828.69887: Calling all_plugins_play to load vars for managed_node2 30564 1726882828.69890: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882828.69893: Calling groups_plugins_play to load vars for managed_node2 30564 1726882828.71011: WORKER PROCESS EXITING 30564 1726882828.72486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882828.74595: done with get_vars() 30564 1726882828.74614: done getting variables 30564 1726882828.74656: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882828.74752: variable 'profile' from source: play vars 30564 1726882828.74755: variable 'interface' from source: play vars 30564 1726882828.74799: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:40:28 -0400 (0:00:00.093) 0:00:27.329 ****** 30564 1726882828.74823: entering _queue_task() for managed_node2/command 30564 1726882828.75046: worker is 1 (out of 1 available) 30564 1726882828.75061: exiting _queue_task() for managed_node2/command 30564 1726882828.75078: done queuing things up, now waiting for results queue to drain 30564 1726882828.75080: 
waiting for pending results... 30564 1726882828.75260: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30564 1726882828.75350: in run() - task 0e448fcc-3ce9-4216-acec-00000000094c 30564 1726882828.75404: variable 'ansible_search_path' from source: unknown 30564 1726882828.75416: variable 'ansible_search_path' from source: unknown 30564 1726882828.75728: calling self._execute() 30564 1726882828.75831: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.75844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.75858: variable 'omit' from source: magic vars 30564 1726882828.76354: variable 'ansible_distribution_major_version' from source: facts 30564 1726882828.76403: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882828.76548: variable 'profile_stat' from source: set_fact 30564 1726882828.76563: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882828.76575: when evaluation is False, skipping this task 30564 1726882828.76583: _execute() done 30564 1726882828.76590: dumping result to json 30564 1726882828.76597: done dumping result, returning 30564 1726882828.76606: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000094c] 30564 1726882828.76628: sending task result for task 0e448fcc-3ce9-4216-acec-00000000094c skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882828.76789: no more pending results, returning what we have 30564 1726882828.76794: results queue empty 30564 1726882828.76795: checking for any_errors_fatal 30564 1726882828.76803: done checking for any_errors_fatal 30564 1726882828.76803: checking for max_fail_percentage 30564 1726882828.76805: done checking for max_fail_percentage 30564 1726882828.76806: 
checking to see if all hosts have failed and the running result is not ok 30564 1726882828.76807: done checking to see if all hosts have failed 30564 1726882828.76808: getting the remaining hosts for this loop 30564 1726882828.76809: done getting the remaining hosts for this loop 30564 1726882828.76813: getting the next task for host managed_node2 30564 1726882828.76822: done getting next task for host managed_node2 30564 1726882828.76825: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882828.76831: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882828.76835: getting variables 30564 1726882828.76838: in VariableManager get_vars() 30564 1726882828.76872: Calling all_inventory to load vars for managed_node2 30564 1726882828.76876: Calling groups_inventory to load vars for managed_node2 30564 1726882828.76880: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882828.76894: Calling all_plugins_play to load vars for managed_node2 30564 1726882828.76898: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882828.76901: Calling groups_plugins_play to load vars for managed_node2 30564 1726882828.77972: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000094c 30564 1726882828.77975: WORKER PROCESS EXITING 30564 1726882828.79513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882828.81300: done with get_vars() 30564 1726882828.81327: done getting variables 30564 1726882828.81384: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882828.81498: variable 'profile' from source: play vars 30564 1726882828.81502: variable 'interface' from source: play vars 30564 1726882828.81560: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:40:28 -0400 (0:00:00.067) 0:00:27.397 ****** 30564 1726882828.81593: entering _queue_task() for managed_node2/set_fact 30564 1726882828.81888: worker is 1 (out of 1 available) 30564 1726882828.81899: exiting _queue_task() for managed_node2/set_fact 30564 
1726882828.81912: done queuing things up, now waiting for results queue to drain 30564 1726882828.81913: waiting for pending results... 30564 1726882828.82847: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 30564 1726882828.83112: in run() - task 0e448fcc-3ce9-4216-acec-00000000094d 30564 1726882828.83122: variable 'ansible_search_path' from source: unknown 30564 1726882828.83127: variable 'ansible_search_path' from source: unknown 30564 1726882828.83176: calling self._execute() 30564 1726882828.83278: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882828.83282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882828.83292: variable 'omit' from source: magic vars 30564 1726882828.83686: variable 'ansible_distribution_major_version' from source: facts 30564 1726882828.83696: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882828.83820: variable 'profile_stat' from source: set_fact 30564 1726882828.83830: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882828.83833: when evaluation is False, skipping this task 30564 1726882828.83836: _execute() done 30564 1726882828.83838: dumping result to json 30564 1726882828.83841: done dumping result, returning 30564 1726882828.83848: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000094d] 30564 1726882828.83854: sending task result for task 0e448fcc-3ce9-4216-acec-00000000094d 30564 1726882828.83948: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000094d 30564 1726882828.83951: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882828.84001: no more pending results, returning what we have 30564 1726882828.84006: results queue empty 
30564 1726882828.84007: checking for any_errors_fatal 30564 1726882828.84015: done checking for any_errors_fatal 30564 1726882828.84016: checking for max_fail_percentage 30564 1726882828.84018: done checking for max_fail_percentage 30564 1726882828.84018: checking to see if all hosts have failed and the running result is not ok 30564 1726882828.84019: done checking to see if all hosts have failed 30564 1726882828.84020: getting the remaining hosts for this loop 30564 1726882828.84022: done getting the remaining hosts for this loop 30564 1726882828.84026: getting the next task for host managed_node2 30564 1726882828.84034: done getting next task for host managed_node2 30564 1726882828.84038: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30564 1726882828.84043: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882828.84048: getting variables 30564 1726882828.84050: in VariableManager get_vars() 30564 1726882828.84085: Calling all_inventory to load vars for managed_node2 30564 1726882828.84088: Calling groups_inventory to load vars for managed_node2 30564 1726882828.84092: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882828.84106: Calling all_plugins_play to load vars for managed_node2 30564 1726882828.84110: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882828.84114: Calling groups_plugins_play to load vars for managed_node2 30564 1726882828.85992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882828.87961: done with get_vars() 30564 1726882828.88550: done getting variables 30564 1726882828.88704: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882828.89020: variable 'profile' from source: play vars 30564 1726882828.89025: variable 'interface' from source: play vars 30564 1726882828.89093: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:40:28 -0400 (0:00:00.075) 0:00:27.472 ****** 30564 1726882828.89126: entering _queue_task() for managed_node2/command 30564 1726882828.89485: worker is 1 (out of 1 available) 30564 1726882828.89498: exiting _queue_task() for managed_node2/command 30564 1726882828.89512: done queuing things up, now waiting for results queue to drain 30564 1726882828.89513: waiting for pending results... 
30564 1726882828.89804: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr
30564 1726882828.89915: in run() - task 0e448fcc-3ce9-4216-acec-00000000094e
30564 1726882828.89930: variable 'ansible_search_path' from source: unknown
30564 1726882828.89934: variable 'ansible_search_path' from source: unknown
30564 1726882828.89975: calling self._execute()
30564 1726882828.90076: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882828.90079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882828.90090: variable 'omit' from source: magic vars
30564 1726882828.90437: variable 'ansible_distribution_major_version' from source: facts
30564 1726882828.90450: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882828.90576: variable 'profile_stat' from source: set_fact
30564 1726882828.90586: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882828.90590: when evaluation is False, skipping this task
30564 1726882828.90592: _execute() done
30564 1726882828.90595: dumping result to json
30564 1726882828.90597: done dumping result, returning
30564 1726882828.90609: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000094e]
30564 1726882828.90617: sending task result for task 0e448fcc-3ce9-4216-acec-00000000094e
30564 1726882828.90706: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000094e
30564 1726882828.90710: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882828.90760: no more pending results, returning what we have
30564 1726882828.90767: results queue empty
30564 1726882828.90769: checking for any_errors_fatal
30564 1726882828.90777: done checking for any_errors_fatal
30564 1726882828.90778: checking for max_fail_percentage
30564 1726882828.90780: done checking for max_fail_percentage
30564 1726882828.90780: checking to see if all hosts have failed and the running result is not ok
30564 1726882828.90781: done checking to see if all hosts have failed
30564 1726882828.90782: getting the remaining hosts for this loop
30564 1726882828.90784: done getting the remaining hosts for this loop
30564 1726882828.90787: getting the next task for host managed_node2
30564 1726882828.90795: done getting next task for host managed_node2
30564 1726882828.90798: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
30564 1726882828.90803: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882828.90807: getting variables
30564 1726882828.90808: in VariableManager get_vars()
30564 1726882828.90838: Calling all_inventory to load vars for managed_node2
30564 1726882828.90841: Calling groups_inventory to load vars for managed_node2
30564 1726882828.90845: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882828.90858: Calling all_plugins_play to load vars for managed_node2
30564 1726882828.90861: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882828.90866: Calling groups_plugins_play to load vars for managed_node2
30564 1726882828.92679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882828.94842: done with get_vars()
30564 1726882828.94868: done getting variables
30564 1726882828.94924: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882828.95337: variable 'profile' from source: play vars
30564 1726882828.95341: variable 'interface' from source: play vars
30564 1726882828.95399: variable 'interface' from source: play vars

TASK [Verify the fingerprint comment in ifcfg-statebr] *************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:40:28 -0400 (0:00:00.063) 0:00:27.535 ******
30564 1726882828.95430: entering _queue_task() for managed_node2/set_fact
30564 1726882828.95822: worker is 1 (out of 1 available)
30564 1726882828.95832: exiting _queue_task() for managed_node2/set_fact
30564 1726882828.95843: done queuing things up, now waiting for results queue to drain
30564 1726882828.95845: waiting for pending results...
30564 1726882828.96238: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr
30564 1726882828.96368: in run() - task 0e448fcc-3ce9-4216-acec-00000000094f
30564 1726882828.96381: variable 'ansible_search_path' from source: unknown
30564 1726882828.96385: variable 'ansible_search_path' from source: unknown
30564 1726882828.96435: calling self._execute()
30564 1726882828.96507: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882828.96510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882828.96520: variable 'omit' from source: magic vars
30564 1726882828.96792: variable 'ansible_distribution_major_version' from source: facts
30564 1726882828.96803: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882828.96891: variable 'profile_stat' from source: set_fact
30564 1726882828.96899: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882828.96902: when evaluation is False, skipping this task
30564 1726882828.96905: _execute() done
30564 1726882828.96908: dumping result to json
30564 1726882828.96910: done dumping result, returning
30564 1726882828.96915: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000094f]
30564 1726882828.96922: sending task result for task 0e448fcc-3ce9-4216-acec-00000000094f
30564 1726882828.97012: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000094f
30564 1726882828.97015: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882828.97068: no more pending results, returning what we have
30564 1726882828.97072: results queue empty
30564 1726882828.97073: checking for any_errors_fatal
30564 1726882828.97081: done checking for any_errors_fatal
30564 1726882828.97081: checking for max_fail_percentage
30564 1726882828.97083: done checking for max_fail_percentage
30564 1726882828.97083: checking to see if all hosts have failed and the running result is not ok
30564 1726882828.97084: done checking to see if all hosts have failed
30564 1726882828.97085: getting the remaining hosts for this loop
30564 1726882828.97087: done getting the remaining hosts for this loop
30564 1726882828.97090: getting the next task for host managed_node2
30564 1726882828.97098: done getting next task for host managed_node2
30564 1726882828.97100: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
30564 1726882828.97104: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882828.97108: getting variables
30564 1726882828.97109: in VariableManager get_vars()
30564 1726882828.97140: Calling all_inventory to load vars for managed_node2
30564 1726882828.97147: Calling groups_inventory to load vars for managed_node2
30564 1726882828.97150: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882828.97159: Calling all_plugins_play to load vars for managed_node2
30564 1726882828.97161: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882828.97165: Calling groups_plugins_play to load vars for managed_node2
30564 1726882828.98071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882828.99738: done with get_vars()
30564 1726882828.99758: done getting variables
30564 1726882828.99827: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882828.99941: variable 'profile' from source: play vars
30564 1726882828.99945: variable 'interface' from source: play vars
30564 1726882829.00024: variable 'interface' from source: play vars

TASK [Assert that the profile is present - 'statebr'] **************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 21:40:28 -0400 (0:00:00.046) 0:00:27.581 ******
30564 1726882829.00071: entering _queue_task() for managed_node2/assert
30564 1726882829.00344: worker is 1 (out of 1 available)
30564 1726882829.00364: exiting _queue_task() for managed_node2/assert
30564 1726882829.00379: done queuing things up, now waiting for results queue to drain
30564 1726882829.00381: waiting for pending results...
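The assert action just queued checks a fact that get_profile_stat.yml recorded earlier with set_fact; the log goes on to show `Evaluated conditional (lsr_net_profile_exists): True` and "All assertions passed". A minimal hedged sketch of such an assert task, based only on the task name and conditional visible in the log (not the real assert_profile_present.yml contents):

```yaml
# Hypothetical sketch; only the task name and the lsr_net_profile_exists
# check are taken from the log, the rest is assumed.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
```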
30564 1726882829.00687: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr'
30564 1726882829.00786: in run() - task 0e448fcc-3ce9-4216-acec-0000000008ae
30564 1726882829.00798: variable 'ansible_search_path' from source: unknown
30564 1726882829.00808: variable 'ansible_search_path' from source: unknown
30564 1726882829.00849: calling self._execute()
30564 1726882829.00923: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.00927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.00941: variable 'omit' from source: magic vars
30564 1726882829.01198: variable 'ansible_distribution_major_version' from source: facts
30564 1726882829.01213: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882829.01219: variable 'omit' from source: magic vars
30564 1726882829.01252: variable 'omit' from source: magic vars
30564 1726882829.01322: variable 'profile' from source: play vars
30564 1726882829.01326: variable 'interface' from source: play vars
30564 1726882829.01375: variable 'interface' from source: play vars
30564 1726882829.01392: variable 'omit' from source: magic vars
30564 1726882829.01424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882829.01448: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882829.01470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882829.01488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882829.01500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882829.01523: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882829.01526: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.01530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.01606: Set connection var ansible_timeout to 10
30564 1726882829.01609: Set connection var ansible_pipelining to False
30564 1726882829.01612: Set connection var ansible_shell_type to sh
30564 1726882829.01614: Set connection var ansible_shell_executable to /bin/sh
30564 1726882829.01621: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882829.01624: Set connection var ansible_connection to ssh
30564 1726882829.01642: variable 'ansible_shell_executable' from source: unknown
30564 1726882829.01645: variable 'ansible_connection' from source: unknown
30564 1726882829.01647: variable 'ansible_module_compression' from source: unknown
30564 1726882829.01650: variable 'ansible_shell_type' from source: unknown
30564 1726882829.01652: variable 'ansible_shell_executable' from source: unknown
30564 1726882829.01654: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.01658: variable 'ansible_pipelining' from source: unknown
30564 1726882829.01661: variable 'ansible_timeout' from source: unknown
30564 1726882829.01666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.01762: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882829.01775: variable 'omit' from source: magic vars
30564 1726882829.01779: starting attempt loop
30564 1726882829.01782: running the handler
30564 1726882829.01855: variable 'lsr_net_profile_exists' from source: set_fact
30564 1726882829.01859: Evaluated conditional (lsr_net_profile_exists): True
30564 1726882829.01867: handler run complete
30564 1726882829.01881: attempt loop complete, returning result
30564 1726882829.01884: _execute() done
30564 1726882829.01887: dumping result to json
30564 1726882829.01889: done dumping result, returning
30564 1726882829.01899: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [0e448fcc-3ce9-4216-acec-0000000008ae]
30564 1726882829.01901: sending task result for task 0e448fcc-3ce9-4216-acec-0000000008ae
30564 1726882829.01984: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000008ae
30564 1726882829.01986: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
30564 1726882829.02058: no more pending results, returning what we have
30564 1726882829.02060: results queue empty
30564 1726882829.02061: checking for any_errors_fatal
30564 1726882829.02068: done checking for any_errors_fatal
30564 1726882829.02069: checking for max_fail_percentage
30564 1726882829.02070: done checking for max_fail_percentage
30564 1726882829.02071: checking to see if all hosts have failed and the running result is not ok
30564 1726882829.02072: done checking to see if all hosts have failed
30564 1726882829.02073: getting the remaining hosts for this loop
30564 1726882829.02074: done getting the remaining hosts for this loop
30564 1726882829.02077: getting the next task for host managed_node2
30564 1726882829.02082: done getting next task for host managed_node2
30564 1726882829.02085: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
30564 1726882829.02088: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882829.02092: getting variables
30564 1726882829.02093: in VariableManager get_vars()
30564 1726882829.02119: Calling all_inventory to load vars for managed_node2
30564 1726882829.02122: Calling groups_inventory to load vars for managed_node2
30564 1726882829.02125: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882829.02133: Calling all_plugins_play to load vars for managed_node2
30564 1726882829.02136: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882829.02139: Calling groups_plugins_play to load vars for managed_node2
30564 1726882829.03829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882829.05874: done with get_vars()
30564 1726882829.05897: done getting variables
30564 1726882829.05973: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882829.06105: variable 'profile' from source: play vars
30564 1726882829.06109: variable 'interface' from source: play vars
30564 1726882829.06179: variable 'interface' from source: play vars

TASK [Assert that the ansible managed comment is present in 'statebr'] *********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Friday 20 September 2024 21:40:29 -0400 (0:00:00.061) 0:00:27.643 ******
30564 1726882829.06221: entering _queue_task() for managed_node2/assert
30564 1726882829.06548: worker is 1 (out of 1 available)
30564 1726882829.06560: exiting _queue_task() for managed_node2/assert
30564 1726882829.06580: done queuing things up, now waiting for results queue to drain
30564 1726882829.06581: waiting for pending results...
30564 1726882829.06886: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr'
30564 1726882829.06999: in run() - task 0e448fcc-3ce9-4216-acec-0000000008af
30564 1726882829.07019: variable 'ansible_search_path' from source: unknown
30564 1726882829.07024: variable 'ansible_search_path' from source: unknown
30564 1726882829.07072: calling self._execute()
30564 1726882829.07309: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.07313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.07316: variable 'omit' from source: magic vars
30564 1726882829.07692: variable 'ansible_distribution_major_version' from source: facts
30564 1726882829.07704: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882829.07710: variable 'omit' from source: magic vars
30564 1726882829.07781: variable 'omit' from source: magic vars
30564 1726882829.08000: variable 'profile' from source: play vars
30564 1726882829.08035: variable 'interface' from source: play vars
30564 1726882829.08395: variable 'interface' from source: play vars
30564 1726882829.08398: variable 'omit' from source: magic vars
30564 1726882829.08401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882829.08652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882829.08655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882829.08657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882829.08659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882829.08661: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882829.08664: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.09612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.09616: Set connection var ansible_timeout to 10
30564 1726882829.09619: Set connection var ansible_pipelining to False
30564 1726882829.09621: Set connection var ansible_shell_type to sh
30564 1726882829.09623: Set connection var ansible_shell_executable to /bin/sh
30564 1726882829.09625: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882829.09628: Set connection var ansible_connection to ssh
30564 1726882829.09630: variable 'ansible_shell_executable' from source: unknown
30564 1726882829.09631: variable 'ansible_connection' from source: unknown
30564 1726882829.09633: variable 'ansible_module_compression' from source: unknown
30564 1726882829.09635: variable 'ansible_shell_type' from source: unknown
30564 1726882829.09637: variable 'ansible_shell_executable' from source: unknown
30564 1726882829.09639: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.09641: variable 'ansible_pipelining' from source: unknown
30564 1726882829.09643: variable 'ansible_timeout' from source: unknown
30564 1726882829.09645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.09648: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882829.09650: variable 'omit' from source: magic vars
30564 1726882829.09652: starting attempt loop
30564 1726882829.09655: running the handler
30564 1726882829.09888: variable 'lsr_net_profile_ansible_managed' from source: set_fact
30564 1726882829.09892: Evaluated conditional (lsr_net_profile_ansible_managed): True
30564 1726882829.09901: handler run complete
30564 1726882829.09927: attempt loop complete, returning result
30564 1726882829.09931: _execute() done
30564 1726882829.09939: dumping result to json
30564 1726882829.09942: done dumping result, returning
30564 1726882829.09950: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0e448fcc-3ce9-4216-acec-0000000008af]
30564 1726882829.09955: sending task result for task 0e448fcc-3ce9-4216-acec-0000000008af
30564 1726882829.10383: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000008af
30564 1726882829.10386: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
30564 1726882829.10426: no more pending results, returning what we have
30564 1726882829.10431: results queue empty
30564 1726882829.10433: checking for any_errors_fatal
30564 1726882829.10437: done checking for any_errors_fatal
30564 1726882829.10438: checking for max_fail_percentage
30564 1726882829.10440: done checking for max_fail_percentage
30564 1726882829.10441: checking to see if all hosts have failed and the running result is not ok
30564 1726882829.10441: done checking to see if all hosts have failed
30564 1726882829.10442: getting the remaining hosts for this loop
30564 1726882829.10444: done getting the remaining hosts for this loop
30564 1726882829.10447: getting the next task for host managed_node2
30564 1726882829.10454: done getting next task for host managed_node2
30564 1726882829.10456: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
30564 1726882829.10460: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882829.10475: getting variables
30564 1726882829.10477: in VariableManager get_vars()
30564 1726882829.10504: Calling all_inventory to load vars for managed_node2
30564 1726882829.10506: Calling groups_inventory to load vars for managed_node2
30564 1726882829.10510: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882829.10520: Calling all_plugins_play to load vars for managed_node2
30564 1726882829.10523: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882829.10526: Calling groups_plugins_play to load vars for managed_node2
30564 1726882829.12361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882829.14934: done with get_vars()
30564 1726882829.14967: done getting variables
30564 1726882829.15026: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882829.15144: variable 'profile' from source: play vars
30564 1726882829.15148: variable 'interface' from source: play vars
30564 1726882829.15221: variable 'interface' from source: play vars

TASK [Assert that the fingerprint comment is present in statebr] ***************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Friday 20 September 2024 21:40:29 -0400 (0:00:00.090) 0:00:27.733 ******
30564 1726882829.15254: entering _queue_task() for managed_node2/assert
30564 1726882829.15690: worker is 1 (out of 1 available)
30564 1726882829.15703: exiting _queue_task() for managed_node2/assert
30564 1726882829.15716: done queuing things up, now waiting for results queue to drain
30564 1726882829.15717: waiting for pending results...
30564 1726882829.15999: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr
30564 1726882829.16120: in run() - task 0e448fcc-3ce9-4216-acec-0000000008b0
30564 1726882829.16139: variable 'ansible_search_path' from source: unknown
30564 1726882829.16146: variable 'ansible_search_path' from source: unknown
30564 1726882829.16189: calling self._execute()
30564 1726882829.16286: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.16297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.16310: variable 'omit' from source: magic vars
30564 1726882829.16645: variable 'ansible_distribution_major_version' from source: facts
30564 1726882829.16663: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882829.16676: variable 'omit' from source: magic vars
30564 1726882829.16728: variable 'omit' from source: magic vars
30564 1726882829.16824: variable 'profile' from source: play vars
30564 1726882829.16835: variable 'interface' from source: play vars
30564 1726882829.16903: variable 'interface' from source: play vars
30564 1726882829.16929: variable 'omit' from source: magic vars
30564 1726882829.16973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882829.17007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882829.17034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882829.17055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882829.17074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882829.17106: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882829.17114: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.17120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.17453: Set connection var ansible_timeout to 10
30564 1726882829.17465: Set connection var ansible_pipelining to False
30564 1726882829.17473: Set connection var ansible_shell_type to sh
30564 1726882829.17484: Set connection var ansible_shell_executable to /bin/sh
30564 1726882829.17495: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882829.17501: Set connection var ansible_connection to ssh
30564 1726882829.17527: variable 'ansible_shell_executable' from source: unknown
30564 1726882829.17538: variable 'ansible_connection' from source: unknown
30564 1726882829.17545: variable 'ansible_module_compression' from source: unknown
30564 1726882829.17550: variable 'ansible_shell_type' from source: unknown
30564 1726882829.17556: variable 'ansible_shell_executable' from source: unknown
30564 1726882829.17561: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882829.17570: variable 'ansible_pipelining' from source: unknown
30564 1726882829.17577: variable 'ansible_timeout' from source: unknown
30564 1726882829.17584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882829.17720: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882829.17736: variable 'omit' from source: magic vars
30564 1726882829.17745: starting attempt loop
30564 1726882829.17756: running the handler
30564 1726882829.17868: variable 'lsr_net_profile_fingerprint' from source: set_fact
30564 1726882829.17880: Evaluated conditional (lsr_net_profile_fingerprint): True
30564 1726882829.17891: handler run complete
30564 1726882829.17908: attempt loop complete, returning result
30564 1726882829.17915: _execute() done
30564 1726882829.17920: dumping result to json
30564 1726882829.17927: done dumping result, returning
30564 1726882829.17936: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [0e448fcc-3ce9-4216-acec-0000000008b0]
30564 1726882829.17945: sending task result for task 0e448fcc-3ce9-4216-acec-0000000008b0
30564 1726882829.18046: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000008b0
30564 1726882829.18053: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
30564 1726882829.18122: no more pending results, returning what we have
30564 1726882829.18127: results queue empty
30564 1726882829.18128: checking for any_errors_fatal
30564 1726882829.18137: done checking for any_errors_fatal
30564 1726882829.18138: checking for max_fail_percentage
30564 1726882829.18140: done checking for max_fail_percentage
30564 1726882829.18141: checking to see if all hosts have failed and the running result is not ok
30564 1726882829.18142: done checking to see if all hosts have failed
30564 1726882829.18143: getting the remaining hosts for this loop
30564 1726882829.18145: done getting the remaining hosts for this loop
30564 1726882829.18149: getting the next task for host managed_node2
30564 1726882829.18158: done getting next task for host managed_node2
30564 1726882829.18162: ^ task is: TASK: Conditional asserts
30564 1726882829.18182: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882829.18188: getting variables
30564 1726882829.18190: in VariableManager get_vars()
30564 1726882829.18223: Calling all_inventory to load vars for managed_node2
30564 1726882829.18226: Calling groups_inventory to load vars for managed_node2
30564 1726882829.18230: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882829.18241: Calling all_plugins_play to load vars for managed_node2
30564 1726882829.18245: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882829.18248: Calling groups_plugins_play to load vars for managed_node2
30564 1726882829.20054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882829.21912: done with get_vars()
30564 1726882829.21937: done getting variables

TASK [Conditional asserts] *****************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42
Friday 20 September 2024 21:40:29 -0400 (0:00:00.067) 0:00:27.801 ******
30564 1726882829.22028: entering _queue_task() for managed_node2/include_tasks
30564 1726882829.22334: worker is 1 (out of 1 available)
30564 1726882829.22346: exiting _queue_task() for managed_node2/include_tasks
30564 1726882829.22359: done queuing things up, now waiting for results queue to drain
30564 1726882829.22360: waiting for pending results...
30564 1726882829.23337: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30564 1726882829.23699: in run() - task 0e448fcc-3ce9-4216-acec-0000000005ba 30564 1726882829.23728: variable 'ansible_search_path' from source: unknown 30564 1726882829.23736: variable 'ansible_search_path' from source: unknown 30564 1726882829.24706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882829.28946: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882829.29019: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882829.29097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882829.29141: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882829.29177: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882829.29272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882829.29310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882829.29458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882829.29714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30564 1726882829.29863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882829.30773: dumping result to json 30564 1726882829.30846: done dumping result, returning 30564 1726882829.30923: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0e448fcc-3ce9-4216-acec-0000000005ba] 30564 1726882829.30947: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005ba skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 30564 1726882829.31157: no more pending results, returning what we have 30564 1726882829.31175: results queue empty 30564 1726882829.31176: checking for any_errors_fatal 30564 1726882829.31182: done checking for any_errors_fatal 30564 1726882829.31183: checking for max_fail_percentage 30564 1726882829.31185: done checking for max_fail_percentage 30564 1726882829.31185: checking to see if all hosts have failed and the running result is not ok 30564 1726882829.31186: done checking to see if all hosts have failed 30564 1726882829.31187: getting the remaining hosts for this loop 30564 1726882829.31189: done getting the remaining hosts for this loop 30564 1726882829.31193: getting the next task for host managed_node2 30564 1726882829.31211: done getting next task for host managed_node2 30564 1726882829.31216: ^ task is: TASK: Success in test '{{ lsr_description }}' 30564 1726882829.31219: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882829.31225: getting variables 30564 1726882829.31227: in VariableManager get_vars() 30564 1726882829.31302: Calling all_inventory to load vars for managed_node2 30564 1726882829.31305: Calling groups_inventory to load vars for managed_node2 30564 1726882829.31316: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882829.31328: Calling all_plugins_play to load vars for managed_node2 30564 1726882829.31332: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882829.31336: Calling groups_plugins_play to load vars for managed_node2 30564 1726882829.32648: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005ba 30564 1726882829.32652: WORKER PROCESS EXITING 30564 1726882829.34033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882829.45920: done with get_vars() 30564 1726882829.45949: done getting variables 30564 1726882829.46000: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882829.46102: variable 'lsr_description' from source: include params TASK [Success in test 'I can create a profile without autoconnect'] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:40:29 -0400 (0:00:00.240) 0:00:28.042 ****** 30564 1726882829.46126: entering _queue_task() for managed_node2/debug 30564 1726882829.46794: worker is 1 (out of 1 available) 30564 1726882829.46827: exiting _queue_task() for managed_node2/debug 30564 
1726882829.46844: done queuing things up, now waiting for results queue to drain 30564 1726882829.46845: waiting for pending results... 30564 1726882829.48167: running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile without autoconnect' 30564 1726882829.48531: in run() - task 0e448fcc-3ce9-4216-acec-0000000005bb 30564 1726882829.48551: variable 'ansible_search_path' from source: unknown 30564 1726882829.48561: variable 'ansible_search_path' from source: unknown 30564 1726882829.48629: calling self._execute() 30564 1726882829.48731: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882829.48750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882829.48798: variable 'omit' from source: magic vars 30564 1726882829.49498: variable 'ansible_distribution_major_version' from source: facts 30564 1726882829.49518: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882829.49536: variable 'omit' from source: magic vars 30564 1726882829.49583: variable 'omit' from source: magic vars 30564 1726882829.49692: variable 'lsr_description' from source: include params 30564 1726882829.49714: variable 'omit' from source: magic vars 30564 1726882829.49770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882829.49812: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882829.49835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882829.49857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882829.49880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882829.49918: variable 'inventory_hostname' from 
source: host vars for 'managed_node2' 30564 1726882829.49925: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882829.49932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882829.50039: Set connection var ansible_timeout to 10 30564 1726882829.50049: Set connection var ansible_pipelining to False 30564 1726882829.50055: Set connection var ansible_shell_type to sh 30564 1726882829.50067: Set connection var ansible_shell_executable to /bin/sh 30564 1726882829.50108: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882829.50115: Set connection var ansible_connection to ssh 30564 1726882829.50146: variable 'ansible_shell_executable' from source: unknown 30564 1726882829.50153: variable 'ansible_connection' from source: unknown 30564 1726882829.50158: variable 'ansible_module_compression' from source: unknown 30564 1726882829.50166: variable 'ansible_shell_type' from source: unknown 30564 1726882829.50174: variable 'ansible_shell_executable' from source: unknown 30564 1726882829.50180: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882829.50192: variable 'ansible_pipelining' from source: unknown 30564 1726882829.50199: variable 'ansible_timeout' from source: unknown 30564 1726882829.50205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882829.50350: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882829.50369: variable 'omit' from source: magic vars 30564 1726882829.50379: starting attempt loop 30564 1726882829.50385: running the handler 30564 1726882829.50547: handler run complete 30564 1726882829.50572: attempt loop complete, returning result 30564 
1726882829.50585: _execute() done 30564 1726882829.50608: dumping result to json 30564 1726882829.50615: done dumping result, returning 30564 1726882829.50628: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can create a profile without autoconnect' [0e448fcc-3ce9-4216-acec-0000000005bb] 30564 1726882829.50641: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005bb 30564 1726882829.50755: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005bb 30564 1726882829.50765: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can create a profile without autoconnect' +++++ 30564 1726882829.50822: no more pending results, returning what we have 30564 1726882829.50827: results queue empty 30564 1726882829.50827: checking for any_errors_fatal 30564 1726882829.50838: done checking for any_errors_fatal 30564 1726882829.50839: checking for max_fail_percentage 30564 1726882829.50841: done checking for max_fail_percentage 30564 1726882829.50842: checking to see if all hosts have failed and the running result is not ok 30564 1726882829.50843: done checking to see if all hosts have failed 30564 1726882829.50844: getting the remaining hosts for this loop 30564 1726882829.50846: done getting the remaining hosts for this loop 30564 1726882829.50850: getting the next task for host managed_node2 30564 1726882829.50860: done getting next task for host managed_node2 30564 1726882829.50865: ^ task is: TASK: Cleanup 30564 1726882829.50869: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882829.50874: getting variables 30564 1726882829.50876: in VariableManager get_vars() 30564 1726882829.50906: Calling all_inventory to load vars for managed_node2 30564 1726882829.50909: Calling groups_inventory to load vars for managed_node2 30564 1726882829.50913: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882829.50924: Calling all_plugins_play to load vars for managed_node2 30564 1726882829.50928: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882829.50931: Calling groups_plugins_play to load vars for managed_node2 30564 1726882829.53959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882829.59429: done with get_vars() 30564 1726882829.59462: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:40:29 -0400 (0:00:00.134) 0:00:28.177 ****** 30564 1726882829.59603: entering _queue_task() for managed_node2/include_tasks 30564 1726882829.59965: worker is 1 (out of 1 available) 30564 1726882829.59978: exiting _queue_task() for managed_node2/include_tasks 30564 1726882829.59990: done queuing things up, now waiting for results queue to drain 30564 1726882829.59992: waiting for pending results... 
30564 1726882829.60291: running TaskExecutor() for managed_node2/TASK: Cleanup 30564 1726882829.60420: in run() - task 0e448fcc-3ce9-4216-acec-0000000005bf 30564 1726882829.60444: variable 'ansible_search_path' from source: unknown 30564 1726882829.60452: variable 'ansible_search_path' from source: unknown 30564 1726882829.60505: variable 'lsr_cleanup' from source: include params 30564 1726882829.60729: variable 'lsr_cleanup' from source: include params 30564 1726882829.60815: variable 'omit' from source: magic vars 30564 1726882829.60966: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882829.60989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882829.61008: variable 'omit' from source: magic vars 30564 1726882829.61251: variable 'ansible_distribution_major_version' from source: facts 30564 1726882829.61270: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882829.61285: variable 'item' from source: unknown 30564 1726882829.61354: variable 'item' from source: unknown 30564 1726882829.61397: variable 'item' from source: unknown 30564 1726882829.61467: variable 'item' from source: unknown 30564 1726882829.61628: dumping result to json 30564 1726882829.61638: done dumping result, returning 30564 1726882829.61648: done running TaskExecutor() for managed_node2/TASK: Cleanup [0e448fcc-3ce9-4216-acec-0000000005bf] 30564 1726882829.61659: sending task result for task 0e448fcc-3ce9-4216-acec-0000000005bf 30564 1726882829.61745: no more pending results, returning what we have 30564 1726882829.61751: in VariableManager get_vars() 30564 1726882829.61788: Calling all_inventory to load vars for managed_node2 30564 1726882829.61792: Calling groups_inventory to load vars for managed_node2 30564 1726882829.61795: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882829.61809: Calling all_plugins_play to load vars for managed_node2 30564 1726882829.61813: 
Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882829.61817: Calling groups_plugins_play to load vars for managed_node2 30564 1726882829.62481: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000005bf 30564 1726882829.62485: WORKER PROCESS EXITING 30564 1726882829.64218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882829.66286: done with get_vars() 30564 1726882829.66305: variable 'ansible_search_path' from source: unknown 30564 1726882829.66306: variable 'ansible_search_path' from source: unknown 30564 1726882829.66344: we have included files to process 30564 1726882829.66346: generating all_blocks data 30564 1726882829.66349: done generating all_blocks data 30564 1726882829.66354: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882829.66356: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882829.66358: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882829.66579: done processing included file 30564 1726882829.66581: iterating over new_blocks loaded from include file 30564 1726882829.66587: in VariableManager get_vars() 30564 1726882829.66607: done with get_vars() 30564 1726882829.66609: filtering new block on tags 30564 1726882829.66637: done filtering new block on tags 30564 1726882829.66639: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30564 1726882829.66644: extending task lists for all hosts with included blocks 30564 
1726882829.68635: done extending task lists 30564 1726882829.68637: done processing included files 30564 1726882829.68638: results queue empty 30564 1726882829.68638: checking for any_errors_fatal 30564 1726882829.68642: done checking for any_errors_fatal 30564 1726882829.68642: checking for max_fail_percentage 30564 1726882829.68643: done checking for max_fail_percentage 30564 1726882829.68644: checking to see if all hosts have failed and the running result is not ok 30564 1726882829.68645: done checking to see if all hosts have failed 30564 1726882829.68646: getting the remaining hosts for this loop 30564 1726882829.68647: done getting the remaining hosts for this loop 30564 1726882829.68650: getting the next task for host managed_node2 30564 1726882829.68654: done getting next task for host managed_node2 30564 1726882829.68656: ^ task is: TASK: Cleanup profile and device 30564 1726882829.68658: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882829.68660: getting variables 30564 1726882829.68661: in VariableManager get_vars() 30564 1726882829.68678: Calling all_inventory to load vars for managed_node2 30564 1726882829.68681: Calling groups_inventory to load vars for managed_node2 30564 1726882829.68683: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882829.68689: Calling all_plugins_play to load vars for managed_node2 30564 1726882829.68691: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882829.68694: Calling groups_plugins_play to load vars for managed_node2 30564 1726882829.69879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882829.71121: done with get_vars() 30564 1726882829.71142: done getting variables 30564 1726882829.71188: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:40:29 -0400 (0:00:00.116) 0:00:28.293 ****** 30564 1726882829.71216: entering _queue_task() for managed_node2/shell 30564 1726882829.71573: worker is 1 (out of 1 available) 30564 1726882829.71585: exiting _queue_task() for managed_node2/shell 30564 1726882829.71597: done queuing things up, now waiting for results queue to drain 30564 1726882829.71598: waiting for pending results... 
30564 1726882829.72690: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30564 1726882829.72696: in run() - task 0e448fcc-3ce9-4216-acec-0000000009a0 30564 1726882829.72699: variable 'ansible_search_path' from source: unknown 30564 1726882829.72702: variable 'ansible_search_path' from source: unknown 30564 1726882829.72705: calling self._execute() 30564 1726882829.72708: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882829.72710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882829.72713: variable 'omit' from source: magic vars 30564 1726882829.72716: variable 'ansible_distribution_major_version' from source: facts 30564 1726882829.72718: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882829.72730: variable 'omit' from source: magic vars 30564 1726882829.73086: variable 'omit' from source: magic vars 30564 1726882829.73089: variable 'interface' from source: play vars 30564 1726882829.73092: variable 'omit' from source: magic vars 30564 1726882829.73095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882829.73098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882829.73100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882829.73103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882829.73106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882829.73277: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882829.73280: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882829.73285: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882829.73288: Set connection var ansible_timeout to 10 30564 1726882829.73291: Set connection var ansible_pipelining to False 30564 1726882829.73294: Set connection var ansible_shell_type to sh 30564 1726882829.73297: Set connection var ansible_shell_executable to /bin/sh 30564 1726882829.73300: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882829.73302: Set connection var ansible_connection to ssh 30564 1726882829.73305: variable 'ansible_shell_executable' from source: unknown 30564 1726882829.73308: variable 'ansible_connection' from source: unknown 30564 1726882829.73372: variable 'ansible_module_compression' from source: unknown 30564 1726882829.73378: variable 'ansible_shell_type' from source: unknown 30564 1726882829.73381: variable 'ansible_shell_executable' from source: unknown 30564 1726882829.73383: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882829.73385: variable 'ansible_pipelining' from source: unknown 30564 1726882829.73388: variable 'ansible_timeout' from source: unknown 30564 1726882829.73391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882829.73512: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882829.73516: variable 'omit' from source: magic vars 30564 1726882829.73519: starting attempt loop 30564 1726882829.73521: running the handler 30564 1726882829.73524: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882829.73569: _low_level_execute_command(): starting 30564 1726882829.73573: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882829.74283: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.74292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882829.74300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882829.74311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882829.74316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.74389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882829.74407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882829.74516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882829.76172: stdout chunk (state=3): >>>/root <<< 30564 1726882829.76290: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 30564 1726882829.76354: stderr chunk (state=3): >>><<< 30564 1726882829.76358: stdout chunk (state=3): >>><<< 30564 1726882829.76460: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882829.76467: _low_level_execute_command(): starting 30564 1726882829.76470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430 `" && echo ansible-tmp-1726882829.7637875-31822-182014277479430="` echo /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430 `" ) && sleep 0' 30564 1726882829.77909: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882829.77912: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882829.77939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882829.77944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882829.77963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882829.77968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.78035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882829.78038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882829.78155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882829.80033: stdout chunk (state=3): >>>ansible-tmp-1726882829.7637875-31822-182014277479430=/root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430 <<< 30564 1726882829.80144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882829.80210: stderr chunk (state=3): >>><<< 30564 1726882829.80214: stdout chunk (state=3): >>><<< 30564 1726882829.80270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882829.7637875-31822-182014277479430=/root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882829.80274: variable 'ansible_module_compression' from source: unknown 30564 1726882829.80480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882829.80485: variable 'ansible_facts' from source: unknown 30564 1726882829.80487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430/AnsiballZ_command.py 30564 1726882829.81020: Sending initial data 30564 1726882829.81024: Sent initial data (156 bytes) 30564 1726882829.82657: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882829.82661: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882829.82709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.82713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882829.82715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.82785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882829.82797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882829.82923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882829.84744: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882829.84835: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 
261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882829.84934: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpxefv8kvg /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430/AnsiballZ_command.py <<< 30564 1726882829.85026: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882829.86537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882829.86549: stderr chunk (state=3): >>><<< 30564 1726882829.86552: stdout chunk (state=3): >>><<< 30564 1726882829.86569: done transferring module to remote 30564 1726882829.86583: _low_level_execute_command(): starting 30564 1726882829.86587: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430/ /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430/AnsiballZ_command.py && sleep 0' 30564 1726882829.87412: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882829.87416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882829.87469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.87478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882829.87485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882829.87490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882829.87505: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882829.87510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.87601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882829.87604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882829.87617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882829.87736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882829.89576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882829.89581: stderr chunk (state=3): >>><<< 30564 1726882829.89586: stdout chunk (state=3): >>><<< 30564 1726882829.89602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882829.89606: _low_level_execute_command(): starting 30564 1726882829.89611: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430/AnsiballZ_command.py && sleep 0' 30564 1726882829.90717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882829.90723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882829.90742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882829.90781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.90787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882829.90803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882829.90809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882829.90887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882829.90894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882829.90908: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882829.91038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882830.07762: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (25972e22-5267-43e5-84f8-5cddc8875a78) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:40:30.037953", "end": "2024-09-20 21:40:30.075466", "delta": "0:00:00.037513", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882830.08985: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882830.09042: stderr chunk (state=3): >>><<< 30564 1726882830.09053: stdout chunk (state=3): >>><<< 30564 1726882830.09066: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (25972e22-5267-43e5-84f8-5cddc8875a78) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:40:30.037953", "end": "2024-09-20 21:40:30.075466", "delta": "0:00:00.037513", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 30564 1726882830.09121: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882830.09128: _low_level_execute_command(): starting 30564 1726882830.09148: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882829.7637875-31822-182014277479430/ > /dev/null 2>&1 && sleep 0' 30564 1726882830.09656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.09724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.09728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.09731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.09803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882830.09823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882830.09974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882830.11793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882830.11797: stdout chunk (state=3): >>><<< 30564 1726882830.11802: stderr chunk (state=3): >>><<< 30564 1726882830.11818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882830.11847: handler run complete 30564 1726882830.11857: Evaluated conditional (False): False 30564 1726882830.11876: attempt loop complete, returning result 30564 1726882830.11889: _execute() done 30564 1726882830.11894: dumping result to json 30564 1726882830.11896: done dumping result, returning 30564 1726882830.11898: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0e448fcc-3ce9-4216-acec-0000000009a0] 30564 1726882830.11906: sending task result for task 0e448fcc-3ce9-4216-acec-0000000009a0 30564 1726882830.12022: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000009a0 30564 1726882830.12025: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.037513", "end": "2024-09-20 21:40:30.075466", "rc": 1, "start": "2024-09-20 21:40:30.037953" } STDOUT: Connection 'statebr' (25972e22-5267-43e5-84f8-5cddc8875a78) successfully deleted. 
STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30564 1726882830.12100: no more pending results, returning what we have 30564 1726882830.12105: results queue empty 30564 1726882830.12106: checking for any_errors_fatal 30564 1726882830.12107: done checking for any_errors_fatal 30564 1726882830.12108: checking for max_fail_percentage 30564 1726882830.12109: done checking for max_fail_percentage 30564 1726882830.12110: checking to see if all hosts have failed and the running result is not ok 30564 1726882830.12111: done checking to see if all hosts have failed 30564 1726882830.12112: getting the remaining hosts for this loop 30564 1726882830.12113: done getting the remaining hosts for this loop 30564 1726882830.12117: getting the next task for host managed_node2 30564 1726882830.12128: done getting next task for host managed_node2 30564 1726882830.12131: ^ task is: TASK: Include the task 'run_test.yml' 30564 1726882830.12133: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882830.12138: getting variables 30564 1726882830.12139: in VariableManager get_vars() 30564 1726882830.12173: Calling all_inventory to load vars for managed_node2 30564 1726882830.12176: Calling groups_inventory to load vars for managed_node2 30564 1726882830.12179: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.12189: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.12193: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.12196: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.13154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.14391: done with get_vars() 30564 1726882830.14408: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:65 Friday 20 September 2024 21:40:30 -0400 (0:00:00.432) 0:00:28.726 ****** 30564 1726882830.14479: entering _queue_task() for managed_node2/include_tasks 30564 1726882830.14700: worker is 1 (out of 1 available) 30564 1726882830.14714: exiting _queue_task() for managed_node2/include_tasks 30564 1726882830.14725: done queuing things up, now waiting for results queue to drain 30564 1726882830.14727: waiting for pending results... 
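[Editor's note] The "Cleanup profile and device" task above reports rc=1 (and is then `...ignoring`'d) even though its first command succeeded: the module runs the whole multi-line script through `/bin/sh -c`, whose exit status is that of the *last* command, and the final `ip link del statebr` failed with `Cannot find device "statebr"`. A minimal sketch with harmless stand-in commands (not the real nmcli/ip calls) reproduces the semantics:

```shell
# A /bin/sh -c script's exit status is that of its LAST command.
# 'true'/'false' stand in for the nmcli/ip commands in the log.
rc_ok=$(sh -c 'false; true'; echo $?)    # earlier step fails, last succeeds
rc_bad=$(sh -c 'true; false'; echo $?)   # earlier step succeeds, last fails
echo "rc_ok=$rc_ok rc_bad=$rc_bad"       # prints: rc_ok=0 rc_bad=1
```

This is why such cleanup tasks are typically paired with `ignore_errors: true` (visible here as the `...ignoring` marker): a missing device on an already-clean host is expected.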
30564 1726882830.14918: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30564 1726882830.15037: in run() - task 0e448fcc-3ce9-4216-acec-000000000011 30564 1726882830.15051: variable 'ansible_search_path' from source: unknown 30564 1726882830.15090: calling self._execute() 30564 1726882830.15184: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.15188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.15198: variable 'omit' from source: magic vars 30564 1726882830.15485: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.15496: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.15499: _execute() done 30564 1726882830.15504: dumping result to json 30564 1726882830.15507: done dumping result, returning 30564 1726882830.15513: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-4216-acec-000000000011] 30564 1726882830.15519: sending task result for task 0e448fcc-3ce9-4216-acec-000000000011 30564 1726882830.15651: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000011 30564 1726882830.15654: WORKER PROCESS EXITING 30564 1726882830.15679: no more pending results, returning what we have 30564 1726882830.15684: in VariableManager get_vars() 30564 1726882830.15741: Calling all_inventory to load vars for managed_node2 30564 1726882830.15744: Calling groups_inventory to load vars for managed_node2 30564 1726882830.15754: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.15766: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.15770: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.15772: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.16955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30564 1726882830.18188: done with get_vars() 30564 1726882830.18201: variable 'ansible_search_path' from source: unknown 30564 1726882830.18214: we have included files to process 30564 1726882830.18218: generating all_blocks data 30564 1726882830.18220: done generating all_blocks data 30564 1726882830.18227: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882830.18228: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882830.18231: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882830.18596: in VariableManager get_vars() 30564 1726882830.18607: done with get_vars() 30564 1726882830.18631: in VariableManager get_vars() 30564 1726882830.18640: done with get_vars() 30564 1726882830.18673: in VariableManager get_vars() 30564 1726882830.18685: done with get_vars() 30564 1726882830.18711: in VariableManager get_vars() 30564 1726882830.18720: done with get_vars() 30564 1726882830.18764: in VariableManager get_vars() 30564 1726882830.18780: done with get_vars() 30564 1726882830.19182: in VariableManager get_vars() 30564 1726882830.19192: done with get_vars() 30564 1726882830.19200: done processing included file 30564 1726882830.19201: iterating over new_blocks loaded from include file 30564 1726882830.19202: in VariableManager get_vars() 30564 1726882830.19208: done with get_vars() 30564 1726882830.19209: filtering new block on tags 30564 1726882830.19272: done filtering new block on tags 30564 1726882830.19274: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30564 1726882830.19278: extending task lists for all hosts with included 
blocks 30564 1726882830.19316: done extending task lists 30564 1726882830.19317: done processing included files 30564 1726882830.19318: results queue empty 30564 1726882830.19319: checking for any_errors_fatal 30564 1726882830.19322: done checking for any_errors_fatal 30564 1726882830.19323: checking for max_fail_percentage 30564 1726882830.19324: done checking for max_fail_percentage 30564 1726882830.19325: checking to see if all hosts have failed and the running result is not ok 30564 1726882830.19326: done checking to see if all hosts have failed 30564 1726882830.19326: getting the remaining hosts for this loop 30564 1726882830.19328: done getting the remaining hosts for this loop 30564 1726882830.19330: getting the next task for host managed_node2 30564 1726882830.19334: done getting next task for host managed_node2 30564 1726882830.19336: ^ task is: TASK: TEST: {{ lsr_description }} 30564 1726882830.19338: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882830.19339: getting variables 30564 1726882830.19340: in VariableManager get_vars() 30564 1726882830.19346: Calling all_inventory to load vars for managed_node2 30564 1726882830.19347: Calling groups_inventory to load vars for managed_node2 30564 1726882830.19349: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.19352: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.19353: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.19355: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.20238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.21840: done with get_vars() 30564 1726882830.21860: done getting variables 30564 1726882830.21902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882830.21990: variable 'lsr_description' from source: include params TASK [TEST: I can activate an existing profile] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:40:30 -0400 (0:00:00.075) 0:00:28.801 ****** 30564 1726882830.22015: entering _queue_task() for managed_node2/debug 30564 1726882830.22235: worker is 1 (out of 1 available) 30564 1726882830.22249: exiting _queue_task() for managed_node2/debug 30564 1726882830.22271: done queuing things up, now waiting for results queue to drain 30564 1726882830.22273: waiting for pending results... 
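[Editor's note] The `_low_level_execute_command()` lines earlier in this log show how Ansible bootstraps each remote action: it creates a uniquely named temp directory under `~/.ansible/tmp` with `umask 77` (so it is mode 0700), echoes back a `NAME=PATH` line that the controller parses to learn the directory's location, SFTPs `AnsiballZ_*.py` into it, runs it, and finally `rm -f -r`'s the directory. A hedged local re-enactment of that handshake (paths here are stand-ins, not the real `/root/.ansible/tmp`):

```shell
# Sketch of Ansible's remote tmpdir handshake, using a local stand-in path.
base="${TMPDIR:-/tmp}/ansible-demo-$$"
name="ansible-tmp-$(date +%s)-demo"

# umask 77 makes the directories private (mode 0700), matching the
# `( umask 77 && mkdir -p ... )` commands in the log; the trailing echo
# is the NAME=PATH line the controller parses.
( umask 77 && mkdir -p "$base" && mkdir "$base/$name" \
  && echo "$name=$base/$name" )

# Later the controller cleans up, as seen in the log's final
# `rm -f -r /root/.ansible/tmp/ansible-tmp-.../` command:
rm -rf "$base"
```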
30564 1726882830.22745: running TaskExecutor() for managed_node2/TASK: TEST: I can activate an existing profile 30564 1726882830.22794: in run() - task 0e448fcc-3ce9-4216-acec-000000000a49 30564 1726882830.22826: variable 'ansible_search_path' from source: unknown 30564 1726882830.22854: variable 'ansible_search_path' from source: unknown 30564 1726882830.22882: calling self._execute() 30564 1726882830.23012: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.23023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.23034: variable 'omit' from source: magic vars 30564 1726882830.23414: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.23425: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.23430: variable 'omit' from source: magic vars 30564 1726882830.23453: variable 'omit' from source: magic vars 30564 1726882830.23616: variable 'lsr_description' from source: include params 30564 1726882830.23620: variable 'omit' from source: magic vars 30564 1726882830.23693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882830.23755: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.23778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882830.23815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.23853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.23879: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.23882: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.23885: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.23994: Set connection var ansible_timeout to 10 30564 1726882830.23998: Set connection var ansible_pipelining to False 30564 1726882830.24000: Set connection var ansible_shell_type to sh 30564 1726882830.24006: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.24012: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.24015: Set connection var ansible_connection to ssh 30564 1726882830.24033: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.24036: variable 'ansible_connection' from source: unknown 30564 1726882830.24040: variable 'ansible_module_compression' from source: unknown 30564 1726882830.24042: variable 'ansible_shell_type' from source: unknown 30564 1726882830.24045: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.24047: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.24049: variable 'ansible_pipelining' from source: unknown 30564 1726882830.24051: variable 'ansible_timeout' from source: unknown 30564 1726882830.24053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.24152: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.24160: variable 'omit' from source: magic vars 30564 1726882830.24164: starting attempt loop 30564 1726882830.24170: running the handler 30564 1726882830.24207: handler run complete 30564 1726882830.24217: attempt loop complete, returning result 30564 1726882830.24220: _execute() done 30564 1726882830.24223: dumping result to json 30564 1726882830.24225: done dumping result, returning 30564 
1726882830.24231: done running TaskExecutor() for managed_node2/TASK: TEST: I can activate an existing profile [0e448fcc-3ce9-4216-acec-000000000a49] 30564 1726882830.24236: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a49 30564 1726882830.24330: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a49 30564 1726882830.24333: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can activate an existing profile ########## 30564 1726882830.24409: no more pending results, returning what we have 30564 1726882830.24412: results queue empty 30564 1726882830.24413: checking for any_errors_fatal 30564 1726882830.24415: done checking for any_errors_fatal 30564 1726882830.24438: checking for max_fail_percentage 30564 1726882830.24441: done checking for max_fail_percentage 30564 1726882830.24442: checking to see if all hosts have failed and the running result is not ok 30564 1726882830.24443: done checking to see if all hosts have failed 30564 1726882830.24443: getting the remaining hosts for this loop 30564 1726882830.24445: done getting the remaining hosts for this loop 30564 1726882830.24448: getting the next task for host managed_node2 30564 1726882830.24453: done getting next task for host managed_node2 30564 1726882830.24456: ^ task is: TASK: Show item 30564 1726882830.24458: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882830.24461: getting variables 30564 1726882830.24463: in VariableManager get_vars() 30564 1726882830.24515: Calling all_inventory to load vars for managed_node2 30564 1726882830.24517: Calling groups_inventory to load vars for managed_node2 30564 1726882830.24520: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.24530: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.24532: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.24535: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.26239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.27199: done with get_vars() 30564 1726882830.27213: done getting variables 30564 1726882830.27251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:40:30 -0400 (0:00:00.052) 0:00:28.854 ****** 30564 1726882830.27274: entering _queue_task() for managed_node2/debug 30564 1726882830.27460: worker is 1 (out of 1 available) 30564 1726882830.27476: exiting _queue_task() for managed_node2/debug 30564 1726882830.27489: done queuing things up, now waiting for results queue to drain 30564 1726882830.27490: waiting for pending results... 
30564 1726882830.27677: running TaskExecutor() for managed_node2/TASK: Show item 30564 1726882830.27737: in run() - task 0e448fcc-3ce9-4216-acec-000000000a4a 30564 1726882830.27749: variable 'ansible_search_path' from source: unknown 30564 1726882830.27752: variable 'ansible_search_path' from source: unknown 30564 1726882830.27793: variable 'omit' from source: magic vars 30564 1726882830.27904: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.27913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.27928: variable 'omit' from source: magic vars 30564 1726882830.28183: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.28195: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.28201: variable 'omit' from source: magic vars 30564 1726882830.28225: variable 'omit' from source: magic vars 30564 1726882830.28257: variable 'item' from source: unknown 30564 1726882830.28309: variable 'item' from source: unknown 30564 1726882830.28322: variable 'omit' from source: magic vars 30564 1726882830.28354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882830.28766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.28771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882830.28773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.28776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.28778: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.28780: variable 'ansible_host' from source: host vars for 'managed_node2' 
30564 1726882830.28783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.28785: Set connection var ansible_timeout to 10 30564 1726882830.28787: Set connection var ansible_pipelining to False 30564 1726882830.28788: Set connection var ansible_shell_type to sh 30564 1726882830.28790: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.28792: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.28794: Set connection var ansible_connection to ssh 30564 1726882830.28797: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.28799: variable 'ansible_connection' from source: unknown 30564 1726882830.28800: variable 'ansible_module_compression' from source: unknown 30564 1726882830.28802: variable 'ansible_shell_type' from source: unknown 30564 1726882830.28804: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.28806: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.28807: variable 'ansible_pipelining' from source: unknown 30564 1726882830.28809: variable 'ansible_timeout' from source: unknown 30564 1726882830.28811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.29100: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.29114: variable 'omit' from source: magic vars 30564 1726882830.29122: starting attempt loop 30564 1726882830.29128: running the handler 30564 1726882830.29176: variable 'lsr_description' from source: include params 30564 1726882830.29241: variable 'lsr_description' from source: include params 30564 1726882830.29255: handler run complete 30564 1726882830.29279: attempt loop 
complete, returning result 30564 1726882830.29299: variable 'item' from source: unknown 30564 1726882830.29360: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can activate an existing profile" } 30564 1726882830.29578: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.29591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.29603: variable 'omit' from source: magic vars 30564 1726882830.29754: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.29769: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.29779: variable 'omit' from source: magic vars 30564 1726882830.29796: variable 'omit' from source: magic vars 30564 1726882830.29841: variable 'item' from source: unknown 30564 1726882830.29907: variable 'item' from source: unknown 30564 1726882830.29928: variable 'omit' from source: magic vars 30564 1726882830.29953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.29967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.29980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.29996: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.30004: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.30011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.30088: Set connection var ansible_timeout to 10 30564 1726882830.30099: Set connection var ansible_pipelining to False 30564 
1726882830.30106: Set connection var ansible_shell_type to sh 30564 1726882830.30115: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.30126: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.30133: Set connection var ansible_connection to ssh 30564 1726882830.30156: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.30165: variable 'ansible_connection' from source: unknown 30564 1726882830.30174: variable 'ansible_module_compression' from source: unknown 30564 1726882830.30182: variable 'ansible_shell_type' from source: unknown 30564 1726882830.30188: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.30194: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.30201: variable 'ansible_pipelining' from source: unknown 30564 1726882830.30207: variable 'ansible_timeout' from source: unknown 30564 1726882830.30214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.30304: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.30317: variable 'omit' from source: magic vars 30564 1726882830.30326: starting attempt loop 30564 1726882830.30333: running the handler 30564 1726882830.30357: variable 'lsr_setup' from source: include params 30564 1726882830.30429: variable 'lsr_setup' from source: include params 30564 1726882830.30478: handler run complete 30564 1726882830.30497: attempt loop complete, returning result 30564 1726882830.30516: variable 'item' from source: unknown 30564 1726882830.30580: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml" ] } 30564 1726882830.30733: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.30747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.30761: variable 'omit' from source: magic vars 30564 1726882830.30913: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.30924: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.30932: variable 'omit' from source: magic vars 30564 1726882830.30950: variable 'omit' from source: magic vars 30564 1726882830.30996: variable 'item' from source: unknown 30564 1726882830.31059: variable 'item' from source: unknown 30564 1726882830.31084: variable 'omit' from source: magic vars 30564 1726882830.31106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.31117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.31127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.31141: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.31148: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.31156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.31237: Set connection var ansible_timeout to 10 30564 1726882830.31247: Set connection var ansible_pipelining to False 30564 1726882830.31260: Set connection var ansible_shell_type to sh 30564 1726882830.31283: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.31297: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.31304: Set connection var 
ansible_connection to ssh 30564 1726882830.31329: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.31337: variable 'ansible_connection' from source: unknown 30564 1726882830.31344: variable 'ansible_module_compression' from source: unknown 30564 1726882830.31351: variable 'ansible_shell_type' from source: unknown 30564 1726882830.31358: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.31366: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.31374: variable 'ansible_pipelining' from source: unknown 30564 1726882830.31381: variable 'ansible_timeout' from source: unknown 30564 1726882830.31389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.31475: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.31490: variable 'omit' from source: magic vars 30564 1726882830.31498: starting attempt loop 30564 1726882830.31505: running the handler 30564 1726882830.31526: variable 'lsr_test' from source: include params 30564 1726882830.31593: variable 'lsr_test' from source: include params 30564 1726882830.31614: handler run complete 30564 1726882830.31632: attempt loop complete, returning result 30564 1726882830.31651: variable 'item' from source: unknown 30564 1726882830.31715: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/activate_profile.yml" ] } 30564 1726882830.31852: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.31870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.31885: variable 'omit' from source: 
magic vars 30564 1726882830.32028: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.32039: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.32047: variable 'omit' from source: magic vars 30564 1726882830.32066: variable 'omit' from source: magic vars 30564 1726882830.32110: variable 'item' from source: unknown 30564 1726882830.32173: variable 'item' from source: unknown 30564 1726882830.32192: variable 'omit' from source: magic vars 30564 1726882830.32214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.32226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.32237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.32251: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.32258: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.32267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.32339: Set connection var ansible_timeout to 10 30564 1726882830.32350: Set connection var ansible_pipelining to False 30564 1726882830.32357: Set connection var ansible_shell_type to sh 30564 1726882830.32369: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.32382: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.32388: Set connection var ansible_connection to ssh 30564 1726882830.32412: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.32420: variable 'ansible_connection' from source: unknown 30564 1726882830.32427: variable 'ansible_module_compression' from source: unknown 30564 1726882830.32434: 
variable 'ansible_shell_type' from source: unknown 30564 1726882830.32440: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.32447: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.32455: variable 'ansible_pipelining' from source: unknown 30564 1726882830.32461: variable 'ansible_timeout' from source: unknown 30564 1726882830.32471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.32557: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.32573: variable 'omit' from source: magic vars 30564 1726882830.32582: starting attempt loop 30564 1726882830.32589: running the handler 30564 1726882830.32610: variable 'lsr_assert' from source: include params 30564 1726882830.32674: variable 'lsr_assert' from source: include params 30564 1726882830.32697: handler run complete 30564 1726882830.32715: attempt loop complete, returning result 30564 1726882830.32735: variable 'item' from source: unknown 30564 1726882830.32798: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_device_present.yml", "tasks/assert_profile_present.yml" ] } 30564 1726882830.32953: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.32969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.32982: variable 'omit' from source: magic vars 30564 1726882830.33152: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.33162: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.33179: variable 'omit' 
from source: magic vars 30564 1726882830.33197: variable 'omit' from source: magic vars 30564 1726882830.33237: variable 'item' from source: unknown 30564 1726882830.33301: variable 'item' from source: unknown 30564 1726882830.33319: variable 'omit' from source: magic vars 30564 1726882830.33339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.33350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.33360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.33376: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.33383: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.33390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.33459: Set connection var ansible_timeout to 10 30564 1726882830.33473: Set connection var ansible_pipelining to False 30564 1726882830.33481: Set connection var ansible_shell_type to sh 30564 1726882830.33491: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.33502: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.33509: Set connection var ansible_connection to ssh 30564 1726882830.33532: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.33540: variable 'ansible_connection' from source: unknown 30564 1726882830.33546: variable 'ansible_module_compression' from source: unknown 30564 1726882830.33552: variable 'ansible_shell_type' from source: unknown 30564 1726882830.33558: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.33567: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882830.33575: variable 'ansible_pipelining' from source: unknown 30564 1726882830.33582: variable 'ansible_timeout' from source: unknown 30564 1726882830.33589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.33672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.33685: variable 'omit' from source: magic vars 30564 1726882830.33694: starting attempt loop 30564 1726882830.33700: running the handler 30564 1726882830.33805: handler run complete 30564 1726882830.33821: attempt loop complete, returning result 30564 1726882830.33840: variable 'item' from source: unknown 30564 1726882830.33906: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 30564 1726882830.34036: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.34049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.34062: variable 'omit' from source: magic vars 30564 1726882830.34205: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.34866: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.34876: variable 'omit' from source: magic vars 30564 1726882830.34893: variable 'omit' from source: magic vars 30564 1726882830.34929: variable 'item' from source: unknown 30564 1726882830.34992: variable 'item' from source: unknown 30564 1726882830.35011: variable 'omit' from source: magic vars 30564 1726882830.35033: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.35045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.35055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.35077: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.35086: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.35093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.35166: Set connection var ansible_timeout to 10 30564 1726882830.35178: Set connection var ansible_pipelining to False 30564 1726882830.35185: Set connection var ansible_shell_type to sh 30564 1726882830.35195: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.35206: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.35212: Set connection var ansible_connection to ssh 30564 1726882830.35236: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.35244: variable 'ansible_connection' from source: unknown 30564 1726882830.35250: variable 'ansible_module_compression' from source: unknown 30564 1726882830.35256: variable 'ansible_shell_type' from source: unknown 30564 1726882830.35261: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.35270: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.35278: variable 'ansible_pipelining' from source: unknown 30564 1726882830.35284: variable 'ansible_timeout' from source: unknown 30564 1726882830.35292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.35379: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.35391: variable 'omit' from source: magic vars 30564 1726882830.35400: starting attempt loop 30564 1726882830.35406: running the handler 30564 1726882830.35427: variable 'lsr_fail_debug' from source: play vars 30564 1726882830.35495: variable 'lsr_fail_debug' from source: play vars 30564 1726882830.35516: handler run complete 30564 1726882830.35535: attempt loop complete, returning result 30564 1726882830.35566: variable 'item' from source: unknown 30564 1726882830.35645: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30564 1726882830.35806: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.35819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.35833: variable 'omit' from source: magic vars 30564 1726882830.35990: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.36000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.36009: variable 'omit' from source: magic vars 30564 1726882830.36026: variable 'omit' from source: magic vars 30564 1726882830.36070: variable 'item' from source: unknown 30564 1726882830.36133: variable 'item' from source: unknown 30564 1726882830.36152: variable 'omit' from source: magic vars 30564 1726882830.36177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.36189: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.36200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.36219: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.36226: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.36234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.36307: Set connection var ansible_timeout to 10 30564 1726882830.36318: Set connection var ansible_pipelining to False 30564 1726882830.36325: Set connection var ansible_shell_type to sh 30564 1726882830.36335: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.36346: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.36354: Set connection var ansible_connection to ssh 30564 1726882830.36379: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.36387: variable 'ansible_connection' from source: unknown 30564 1726882830.36394: variable 'ansible_module_compression' from source: unknown 30564 1726882830.36401: variable 'ansible_shell_type' from source: unknown 30564 1726882830.36408: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.36415: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.36423: variable 'ansible_pipelining' from source: unknown 30564 1726882830.36429: variable 'ansible_timeout' from source: unknown 30564 1726882830.36437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.36524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.36536: variable 'omit' from source: magic vars 30564 1726882830.36544: starting attempt loop 30564 1726882830.36551: running the handler 30564 1726882830.36575: variable 'lsr_cleanup' from source: include params 30564 1726882830.36638: variable 'lsr_cleanup' from source: include params 30564 1726882830.36659: handler run complete 30564 1726882830.36680: attempt loop complete, returning result 30564 1726882830.36700: variable 'item' from source: unknown 30564 1726882830.36762: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30564 1726882830.36857: dumping result to json 30564 1726882830.36874: done dumping result, returning 30564 1726882830.36887: done running TaskExecutor() for managed_node2/TASK: Show item [0e448fcc-3ce9-4216-acec-000000000a4a] 30564 1726882830.36897: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4a 30564 1726882830.37061: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4a 30564 1726882830.37072: WORKER PROCESS EXITING 30564 1726882830.37129: no more pending results, returning what we have 30564 1726882830.37132: results queue empty 30564 1726882830.37133: checking for any_errors_fatal 30564 1726882830.37141: done checking for any_errors_fatal 30564 1726882830.37142: checking for max_fail_percentage 30564 1726882830.37143: done checking for max_fail_percentage 30564 1726882830.37144: checking to see if all hosts have failed and the running result is not ok 30564 1726882830.37145: done checking to see if all hosts have failed 30564 1726882830.37146: getting the remaining hosts for this loop 30564 1726882830.37147: done getting the remaining hosts for this loop 30564 
1726882830.37151: getting the next task for host managed_node2 30564 1726882830.37157: done getting next task for host managed_node2 30564 1726882830.37159: ^ task is: TASK: Include the task 'show_interfaces.yml' 30564 1726882830.37162: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882830.37170: getting variables 30564 1726882830.37171: in VariableManager get_vars() 30564 1726882830.37200: Calling all_inventory to load vars for managed_node2 30564 1726882830.37203: Calling groups_inventory to load vars for managed_node2 30564 1726882830.37206: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.37217: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.37220: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.37222: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.40826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.44823: done with get_vars() 30564 1726882830.44852: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:40:30 -0400 (0:00:00.178) 0:00:29.032 ****** 30564 1726882830.45146: entering _queue_task() for managed_node2/include_tasks 30564 
1726882830.45827: worker is 1 (out of 1 available) 30564 1726882830.45840: exiting _queue_task() for managed_node2/include_tasks 30564 1726882830.45969: done queuing things up, now waiting for results queue to drain 30564 1726882830.45972: waiting for pending results... 30564 1726882830.46775: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30564 1726882830.47166: in run() - task 0e448fcc-3ce9-4216-acec-000000000a4b 30564 1726882830.47188: variable 'ansible_search_path' from source: unknown 30564 1726882830.47195: variable 'ansible_search_path' from source: unknown 30564 1726882830.47236: calling self._execute() 30564 1726882830.47558: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.47573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.47589: variable 'omit' from source: magic vars 30564 1726882830.47938: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.48285: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.48296: _execute() done 30564 1726882830.48302: dumping result to json 30564 1726882830.48309: done dumping result, returning 30564 1726882830.48317: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-4216-acec-000000000a4b] 30564 1726882830.48326: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4b 30564 1726882830.48451: no more pending results, returning what we have 30564 1726882830.48456: in VariableManager get_vars() 30564 1726882830.48501: Calling all_inventory to load vars for managed_node2 30564 1726882830.48504: Calling groups_inventory to load vars for managed_node2 30564 1726882830.48507: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.48523: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.48526: Calling groups_plugins_inventory to load 
vars for managed_node2 30564 1726882830.48530: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.49091: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4b 30564 1726882830.49095: WORKER PROCESS EXITING 30564 1726882830.50891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.54946: done with get_vars() 30564 1726882830.54971: variable 'ansible_search_path' from source: unknown 30564 1726882830.54972: variable 'ansible_search_path' from source: unknown 30564 1726882830.55013: we have included files to process 30564 1726882830.55014: generating all_blocks data 30564 1726882830.55016: done generating all_blocks data 30564 1726882830.55020: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882830.55022: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882830.55024: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882830.55247: in VariableManager get_vars() 30564 1726882830.55385: done with get_vars() 30564 1726882830.55625: done processing included file 30564 1726882830.55627: iterating over new_blocks loaded from include file 30564 1726882830.55629: in VariableManager get_vars() 30564 1726882830.55644: done with get_vars() 30564 1726882830.55646: filtering new block on tags 30564 1726882830.55688: done filtering new block on tags 30564 1726882830.55690: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30564 1726882830.55696: extending task lists for all hosts with included blocks 30564 1726882830.56766: 
done extending task lists 30564 1726882830.56770: done processing included files 30564 1726882830.56771: results queue empty 30564 1726882830.56772: checking for any_errors_fatal 30564 1726882830.56778: done checking for any_errors_fatal 30564 1726882830.56779: checking for max_fail_percentage 30564 1726882830.56780: done checking for max_fail_percentage 30564 1726882830.56781: checking to see if all hosts have failed and the running result is not ok 30564 1726882830.56782: done checking to see if all hosts have failed 30564 1726882830.56782: getting the remaining hosts for this loop 30564 1726882830.56897: done getting the remaining hosts for this loop 30564 1726882830.56901: getting the next task for host managed_node2 30564 1726882830.56906: done getting next task for host managed_node2 30564 1726882830.56908: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30564 1726882830.56912: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882830.56914: getting variables 30564 1726882830.56915: in VariableManager get_vars() 30564 1726882830.56924: Calling all_inventory to load vars for managed_node2 30564 1726882830.56927: Calling groups_inventory to load vars for managed_node2 30564 1726882830.56929: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.56934: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.56936: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.56939: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.59598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.61744: done with get_vars() 30564 1726882830.61777: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:40:30 -0400 (0:00:00.167) 0:00:29.199 ****** 30564 1726882830.61855: entering _queue_task() for managed_node2/include_tasks 30564 1726882830.62231: worker is 1 (out of 1 available) 30564 1726882830.62252: exiting _queue_task() for managed_node2/include_tasks 30564 1726882830.62273: done queuing things up, now waiting for results queue to drain 30564 1726882830.62274: waiting for pending results... 
30564 1726882830.62611: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30564 1726882830.62722: in run() - task 0e448fcc-3ce9-4216-acec-000000000a72 30564 1726882830.62739: variable 'ansible_search_path' from source: unknown 30564 1726882830.62744: variable 'ansible_search_path' from source: unknown 30564 1726882830.62790: calling self._execute() 30564 1726882830.62913: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.62927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.62938: variable 'omit' from source: magic vars 30564 1726882830.64224: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.64238: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.64245: _execute() done 30564 1726882830.64248: dumping result to json 30564 1726882830.64250: done dumping result, returning 30564 1726882830.64257: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-4216-acec-000000000a72] 30564 1726882830.64262: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a72 30564 1726882830.64370: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a72 30564 1726882830.64401: no more pending results, returning what we have 30564 1726882830.64407: in VariableManager get_vars() 30564 1726882830.64449: Calling all_inventory to load vars for managed_node2 30564 1726882830.64452: Calling groups_inventory to load vars for managed_node2 30564 1726882830.64456: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.64475: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.64480: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.64484: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.65501: WORKER PROCESS EXITING 30564 
1726882830.67558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.71310: done with get_vars() 30564 1726882830.71336: variable 'ansible_search_path' from source: unknown 30564 1726882830.71338: variable 'ansible_search_path' from source: unknown 30564 1726882830.71381: we have included files to process 30564 1726882830.71383: generating all_blocks data 30564 1726882830.71384: done generating all_blocks data 30564 1726882830.71386: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882830.71388: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882830.71390: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882830.72020: done processing included file 30564 1726882830.72022: iterating over new_blocks loaded from include file 30564 1726882830.72024: in VariableManager get_vars() 30564 1726882830.72041: done with get_vars() 30564 1726882830.72043: filtering new block on tags 30564 1726882830.72199: done filtering new block on tags 30564 1726882830.72202: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30564 1726882830.72207: extending task lists for all hosts with included blocks 30564 1726882830.72612: done extending task lists 30564 1726882830.72614: done processing included files 30564 1726882830.72615: results queue empty 30564 1726882830.72615: checking for any_errors_fatal 30564 1726882830.72619: done checking for any_errors_fatal 30564 1726882830.72620: checking for max_fail_percentage 30564 1726882830.72621: done 
checking for max_fail_percentage 30564 1726882830.72622: checking to see if all hosts have failed and the running result is not ok 30564 1726882830.72622: done checking to see if all hosts have failed 30564 1726882830.72623: getting the remaining hosts for this loop 30564 1726882830.72624: done getting the remaining hosts for this loop 30564 1726882830.72627: getting the next task for host managed_node2 30564 1726882830.72632: done getting next task for host managed_node2 30564 1726882830.72634: ^ task is: TASK: Gather current interface info 30564 1726882830.72638: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882830.72639: getting variables 30564 1726882830.72640: in VariableManager get_vars() 30564 1726882830.72650: Calling all_inventory to load vars for managed_node2 30564 1726882830.72652: Calling groups_inventory to load vars for managed_node2 30564 1726882830.72654: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882830.72659: Calling all_plugins_play to load vars for managed_node2 30564 1726882830.72662: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882830.72666: Calling groups_plugins_play to load vars for managed_node2 30564 1726882830.75393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882830.79294: done with get_vars() 30564 1726882830.79317: done getting variables 30564 1726882830.79484: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:40:30 -0400 (0:00:00.176) 0:00:29.376 ****** 30564 1726882830.79515: entering _queue_task() for managed_node2/command 30564 1726882830.80013: worker is 1 (out of 1 available) 30564 1726882830.80025: exiting _queue_task() for managed_node2/command 30564 1726882830.80037: done queuing things up, now waiting for results queue to drain 30564 1726882830.80039: waiting for pending results... 
30564 1726882830.80681: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30564 1726882830.80686: in run() - task 0e448fcc-3ce9-4216-acec-000000000aad 30564 1726882830.80689: variable 'ansible_search_path' from source: unknown 30564 1726882830.80693: variable 'ansible_search_path' from source: unknown 30564 1726882830.80697: calling self._execute() 30564 1726882830.80701: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.80704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.80708: variable 'omit' from source: magic vars 30564 1726882830.81105: variable 'ansible_distribution_major_version' from source: facts 30564 1726882830.81121: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882830.81124: variable 'omit' from source: magic vars 30564 1726882830.81178: variable 'omit' from source: magic vars 30564 1726882830.81214: variable 'omit' from source: magic vars 30564 1726882830.81261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882830.81307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882830.81328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882830.81342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.81355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882830.81391: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882830.81394: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.81397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882830.81542: Set connection var ansible_timeout to 10 30564 1726882830.81547: Set connection var ansible_pipelining to False 30564 1726882830.81550: Set connection var ansible_shell_type to sh 30564 1726882830.81556: Set connection var ansible_shell_executable to /bin/sh 30564 1726882830.81566: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882830.81569: Set connection var ansible_connection to ssh 30564 1726882830.81600: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.81604: variable 'ansible_connection' from source: unknown 30564 1726882830.81607: variable 'ansible_module_compression' from source: unknown 30564 1726882830.81610: variable 'ansible_shell_type' from source: unknown 30564 1726882830.81612: variable 'ansible_shell_executable' from source: unknown 30564 1726882830.81615: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882830.81617: variable 'ansible_pipelining' from source: unknown 30564 1726882830.81625: variable 'ansible_timeout' from source: unknown 30564 1726882830.81630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882830.81786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882830.81795: variable 'omit' from source: magic vars 30564 1726882830.81801: starting attempt loop 30564 1726882830.81809: running the handler 30564 1726882830.81827: _low_level_execute_command(): starting 30564 1726882830.81835: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882830.84228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882830.84244: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882830.84260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882830.84283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.84393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882830.84406: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882830.84419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.84437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882830.84448: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882830.84461: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882830.84476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882830.84490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882830.84504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.84516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882830.84526: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882830.84540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.84620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882830.84644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882830.84666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882830.84811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30564 1726882830.86457: stdout chunk (state=3): >>>/root <<< 30564 1726882830.86640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882830.86643: stdout chunk (state=3): >>><<< 30564 1726882830.86645: stderr chunk (state=3): >>><<< 30564 1726882830.86758: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882830.86765: _low_level_execute_command(): starting 30564 1726882830.86769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448 `" && echo ansible-tmp-1726882830.866682-31877-14548357277448="` echo /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448 `" ) && sleep 0' 30564 1726882830.88185: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882830.88189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.88345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.88349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882830.88352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.88424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882830.88487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882830.88645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882830.90522: stdout chunk (state=3): >>>ansible-tmp-1726882830.866682-31877-14548357277448=/root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448 <<< 30564 1726882830.90631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882830.90710: stderr chunk (state=3): >>><<< 30564 1726882830.90713: stdout chunk (state=3): >>><<< 30564 1726882830.90973: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882830.866682-31877-14548357277448=/root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882830.90977: variable 'ansible_module_compression' from source: unknown 30564 1726882830.90979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882830.90981: variable 'ansible_facts' from source: unknown 30564 1726882830.90983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448/AnsiballZ_command.py 30564 1726882830.91792: Sending initial data 30564 1726882830.91795: Sent initial data (154 bytes) 30564 1726882830.93090: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882830.93104: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 30564 1726882830.93117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882830.93133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.93177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882830.93194: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882830.93209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.93227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882830.93240: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882830.93250: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882830.93262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882830.93279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882830.93302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.93313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882830.93323: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882830.93334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.93418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882830.93442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882830.93459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882830.93594: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882830.95339: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882830.95437: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882830.95540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpsdbk8wii /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448/AnsiballZ_command.py <<< 30564 1726882830.95633: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882830.97238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882830.97323: stderr chunk (state=3): >>><<< 30564 1726882830.97326: stdout chunk (state=3): >>><<< 30564 1726882830.97347: done transferring module to remote 30564 1726882830.97358: _low_level_execute_command(): starting 30564 1726882830.97363: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448/ /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448/AnsiballZ_command.py && sleep 0' 30564 1726882830.98923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882830.98930: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882830.98983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882830.98987: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882830.99468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882830.99837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882830.99840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882830.99853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882830.99981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882831.01816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882831.01820: stderr chunk (state=3): >>><<< 30564 1726882831.01822: stdout chunk (state=3): >>><<< 30564 1726882831.01839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882831.01842: _low_level_execute_command(): starting 30564 1726882831.01848: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448/AnsiballZ_command.py && sleep 0' 30564 1726882831.03306: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882831.03310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882831.03485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882831.03489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882831.03505: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882831.03510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882831.03698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882831.03702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882831.03714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882831.03843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882831.17324: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:31.167965", "end": "2024-09-20 21:40:31.171267", "delta": "0:00:00.003302", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882831.18495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882831.18545: stderr chunk (state=3): >>><<< 30564 1726882831.18548: stdout chunk (state=3): >>><<< 30564 1726882831.18565: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:31.167965", "end": "2024-09-20 21:40:31.171267", "delta": "0:00:00.003302", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
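Not part of the log: the `_low_level_execute_command()` result above carries the AnsiballZ_command module's reply as a JSON document on stdout. A minimal sketch of how that payload decomposes into the interface list the later tasks use, assuming only the JSON shape visible in the log (the `raw_result` sample below is a hypothetical abbreviation of the logged output, not the full module reply):

```python
import json

# Hypothetical sample mirroring the logged AnsiballZ_command result:
# the stdout of `ls -1` run with chdir=/sys/class/net on managed_node2.
raw_result = '''{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr",
 "stderr": "", "rc": 0, "cmd": ["ls", "-1"]}'''

def interfaces_from_result(raw: str) -> list[str]:
    """Split the command module's stdout into one entry per interface,
    failing loudly if the remote command reported a nonzero rc."""
    result = json.loads(raw)
    if result.get("rc") != 0:
        raise RuntimeError(f"remote command failed: rc={result.get('rc')}")
    return result["stdout"].splitlines()

print(interfaces_from_result(raw_result))
```

This is the same split the controller performs when it exposes `stdout_lines` on a command result; `interfaces_from_result` is an illustrative name, not an Ansible API.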
30564 1726882831.18596: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882831.18602: _low_level_execute_command(): starting 30564 1726882831.18611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882830.866682-31877-14548357277448/ > /dev/null 2>&1 && sleep 0' 30564 1726882831.19045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882831.19049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882831.19070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882831.19092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882831.19095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882831.19136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882831.19148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882831.19254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882831.21056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882831.21114: stderr chunk (state=3): >>><<< 30564 1726882831.21117: stdout chunk (state=3): >>><<< 30564 1726882831.21370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 30564 1726882831.21373: handler run complete 30564 1726882831.21375: Evaluated conditional (False): False 30564 1726882831.21377: attempt loop complete, returning result 30564 1726882831.21379: _execute() done 30564 1726882831.21381: dumping result to json 30564 1726882831.21382: done dumping result, returning 30564 1726882831.21384: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-4216-acec-000000000aad] 30564 1726882831.21386: sending task result for task 0e448fcc-3ce9-4216-acec-000000000aad 30564 1726882831.21455: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000aad 30564 1726882831.21459: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003302", "end": "2024-09-20 21:40:31.171267", "rc": 0, "start": "2024-09-20 21:40:31.167965" } STDOUT: bonding_masters eth0 lo rpltstbr 30564 1726882831.21534: no more pending results, returning what we have 30564 1726882831.21537: results queue empty 30564 1726882831.21538: checking for any_errors_fatal 30564 1726882831.21540: done checking for any_errors_fatal 30564 1726882831.21540: checking for max_fail_percentage 30564 1726882831.21543: done checking for max_fail_percentage 30564 1726882831.21544: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.21544: done checking to see if all hosts have failed 30564 1726882831.21545: getting the remaining hosts for this loop 30564 1726882831.21547: done getting the remaining hosts for this loop 30564 1726882831.21550: getting the next task for host managed_node2 30564 1726882831.21557: done getting next task for host managed_node2 30564 1726882831.21559: ^ task is: TASK: Set current_interfaces 30564 1726882831.21565: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882831.21571: getting variables 30564 1726882831.21572: in VariableManager get_vars() 30564 1726882831.21601: Calling all_inventory to load vars for managed_node2 30564 1726882831.21603: Calling groups_inventory to load vars for managed_node2 30564 1726882831.21606: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.21616: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.21618: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.21621: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.22724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.23670: done with get_vars() 30564 1726882831.23687: done getting variables 30564 1726882831.23731: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:40:31 -0400 (0:00:00.442) 0:00:29.818 ****** 30564 1726882831.23754: entering _queue_task() for managed_node2/set_fact 30564 1726882831.23977: worker is 1 (out of 1 available) 30564 1726882831.23991: exiting _queue_task() for managed_node2/set_fact 30564 1726882831.24003: done queuing things up, now waiting for results queue to drain 30564 1726882831.24004: waiting for pending results... 30564 1726882831.24185: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30564 1726882831.24258: in run() - task 0e448fcc-3ce9-4216-acec-000000000aae 30564 1726882831.24273: variable 'ansible_search_path' from source: unknown 30564 1726882831.24278: variable 'ansible_search_path' from source: unknown 30564 1726882831.24303: calling self._execute() 30564 1726882831.24380: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.24383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.24394: variable 'omit' from source: magic vars 30564 1726882831.24663: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.24677: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.24686: variable 'omit' from source: magic vars 30564 1726882831.24716: variable 'omit' from source: magic vars 30564 1726882831.24792: variable '_current_interfaces' from source: set_fact 30564 1726882831.24840: variable 'omit' from source: magic vars 30564 1726882831.24874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 
1726882831.24905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882831.24921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882831.24935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882831.24944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882831.24971: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882831.24974: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.24977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.25046: Set connection var ansible_timeout to 10 30564 1726882831.25049: Set connection var ansible_pipelining to False 30564 1726882831.25052: Set connection var ansible_shell_type to sh 30564 1726882831.25057: Set connection var ansible_shell_executable to /bin/sh 30564 1726882831.25065: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882831.25074: Set connection var ansible_connection to ssh 30564 1726882831.25091: variable 'ansible_shell_executable' from source: unknown 30564 1726882831.25094: variable 'ansible_connection' from source: unknown 30564 1726882831.25096: variable 'ansible_module_compression' from source: unknown 30564 1726882831.25099: variable 'ansible_shell_type' from source: unknown 30564 1726882831.25102: variable 'ansible_shell_executable' from source: unknown 30564 1726882831.25105: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.25107: variable 'ansible_pipelining' from source: unknown 30564 1726882831.25109: variable 'ansible_timeout' from source: unknown 30564 1726882831.25111: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 30564 1726882831.25215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882831.25225: variable 'omit' from source: magic vars 30564 1726882831.25228: starting attempt loop 30564 1726882831.25232: running the handler 30564 1726882831.25243: handler run complete 30564 1726882831.25250: attempt loop complete, returning result 30564 1726882831.25253: _execute() done 30564 1726882831.25255: dumping result to json 30564 1726882831.25258: done dumping result, returning 30564 1726882831.25267: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-4216-acec-000000000aae] 30564 1726882831.25272: sending task result for task 0e448fcc-3ce9-4216-acec-000000000aae 30564 1726882831.25358: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000aae 30564 1726882831.25361: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30564 1726882831.25417: no more pending results, returning what we have 30564 1726882831.25421: results queue empty 30564 1726882831.25422: checking for any_errors_fatal 30564 1726882831.25427: done checking for any_errors_fatal 30564 1726882831.25428: checking for max_fail_percentage 30564 1726882831.25430: done checking for max_fail_percentage 30564 1726882831.25431: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.25431: done checking to see if all hosts have failed 30564 1726882831.25432: getting the remaining hosts for this loop 30564 1726882831.25434: done getting the remaining hosts for this loop 30564 1726882831.25442: getting the next task for host 
managed_node2 30564 1726882831.25453: done getting next task for host managed_node2 30564 1726882831.25455: ^ task is: TASK: Show current_interfaces 30564 1726882831.25459: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882831.25462: getting variables 30564 1726882831.25464: in VariableManager get_vars() 30564 1726882831.25493: Calling all_inventory to load vars for managed_node2 30564 1726882831.25495: Calling groups_inventory to load vars for managed_node2 30564 1726882831.25498: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.25507: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.25509: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.25512: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.26433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.27360: done with get_vars() 30564 1726882831.27379: done getting variables 30564 1726882831.27419: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:40:31 -0400 (0:00:00.036) 0:00:29.855 ****** 30564 1726882831.27440: entering _queue_task() for managed_node2/debug 30564 1726882831.27646: worker is 1 (out of 1 available) 30564 1726882831.27659: exiting _queue_task() for managed_node2/debug 30564 1726882831.27676: done queuing things up, now waiting for results queue to drain 30564 1726882831.27677: waiting for pending results... 
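Not part of the log: the `Set current_interfaces` task traced above (task path `tasks/get_current_interfaces.yml:9`) resolves `_current_interfaces` from an earlier `set_fact` and publishes `current_interfaces` to the host. A hypothetical reconstruction of that task, assuming it simply promotes the registered stdout lines (the exact Jinja2 expression is an assumption; the task name, variable names, and the `ansible_distribution_major_version != '6'` conditional are taken from the log):

```yaml
# Hypothetical sketch; the real task lives at
# tests/network/playbooks/tasks/get_current_interfaces.yml:9.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
  when: ansible_distribution_major_version != '6'
```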
30564 1726882831.27854: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30564 1726882831.27924: in run() - task 0e448fcc-3ce9-4216-acec-000000000a73 30564 1726882831.27938: variable 'ansible_search_path' from source: unknown 30564 1726882831.27941: variable 'ansible_search_path' from source: unknown 30564 1726882831.27972: calling self._execute() 30564 1726882831.28043: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.28047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.28056: variable 'omit' from source: magic vars 30564 1726882831.28322: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.28334: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.28339: variable 'omit' from source: magic vars 30564 1726882831.28374: variable 'omit' from source: magic vars 30564 1726882831.28437: variable 'current_interfaces' from source: set_fact 30564 1726882831.28459: variable 'omit' from source: magic vars 30564 1726882831.28493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882831.28520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882831.28534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882831.28547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882831.28558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882831.28584: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882831.28587: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.28590: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.28657: Set connection var ansible_timeout to 10 30564 1726882831.28660: Set connection var ansible_pipelining to False 30564 1726882831.28664: Set connection var ansible_shell_type to sh 30564 1726882831.28673: Set connection var ansible_shell_executable to /bin/sh 30564 1726882831.28680: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882831.28682: Set connection var ansible_connection to ssh 30564 1726882831.28702: variable 'ansible_shell_executable' from source: unknown 30564 1726882831.28705: variable 'ansible_connection' from source: unknown 30564 1726882831.28708: variable 'ansible_module_compression' from source: unknown 30564 1726882831.28710: variable 'ansible_shell_type' from source: unknown 30564 1726882831.28712: variable 'ansible_shell_executable' from source: unknown 30564 1726882831.28715: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.28717: variable 'ansible_pipelining' from source: unknown 30564 1726882831.28719: variable 'ansible_timeout' from source: unknown 30564 1726882831.28726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.28819: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882831.28829: variable 'omit' from source: magic vars 30564 1726882831.28832: starting attempt loop 30564 1726882831.28835: running the handler 30564 1726882831.28874: handler run complete 30564 1726882831.28884: attempt loop complete, returning result 30564 1726882831.28888: _execute() done 30564 1726882831.28891: dumping result to json 30564 1726882831.28893: done dumping result, returning 30564 1726882831.28900: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-4216-acec-000000000a73] 30564 1726882831.28903: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a73 30564 1726882831.28992: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a73 30564 1726882831.28995: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30564 1726882831.29044: no more pending results, returning what we have 30564 1726882831.29047: results queue empty 30564 1726882831.29048: checking for any_errors_fatal 30564 1726882831.29059: done checking for any_errors_fatal 30564 1726882831.29060: checking for max_fail_percentage 30564 1726882831.29062: done checking for max_fail_percentage 30564 1726882831.29064: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.29065: done checking to see if all hosts have failed 30564 1726882831.29066: getting the remaining hosts for this loop 30564 1726882831.29069: done getting the remaining hosts for this loop 30564 1726882831.29073: getting the next task for host managed_node2 30564 1726882831.29081: done getting next task for host managed_node2 30564 1726882831.29084: ^ task is: TASK: Setup 30564 1726882831.29086: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882831.29089: getting variables 30564 1726882831.29091: in VariableManager get_vars() 30564 1726882831.29115: Calling all_inventory to load vars for managed_node2 30564 1726882831.29117: Calling groups_inventory to load vars for managed_node2 30564 1726882831.29120: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.29129: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.29131: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.29134: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.29940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.30886: done with get_vars() 30564 1726882831.30902: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:40:31 -0400 (0:00:00.035) 0:00:29.890 ****** 30564 1726882831.30961: entering _queue_task() for managed_node2/include_tasks 30564 1726882831.31150: worker is 1 (out of 1 available) 30564 1726882831.31166: exiting _queue_task() for managed_node2/include_tasks 30564 1726882831.31181: done queuing things up, now waiting for results queue to drain 30564 1726882831.31183: waiting for pending results... 
30564 1726882831.31373: running TaskExecutor() for managed_node2/TASK: Setup 30564 1726882831.31429: in run() - task 0e448fcc-3ce9-4216-acec-000000000a4c 30564 1726882831.31441: variable 'ansible_search_path' from source: unknown 30564 1726882831.31446: variable 'ansible_search_path' from source: unknown 30564 1726882831.31480: variable 'lsr_setup' from source: include params 30564 1726882831.31623: variable 'lsr_setup' from source: include params 30564 1726882831.31673: variable 'omit' from source: magic vars 30564 1726882831.31766: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.31773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.31783: variable 'omit' from source: magic vars 30564 1726882831.31942: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.31951: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.31955: variable 'item' from source: unknown 30564 1726882831.32006: variable 'item' from source: unknown 30564 1726882831.32029: variable 'item' from source: unknown 30564 1726882831.32074: variable 'item' from source: unknown 30564 1726882831.32193: dumping result to json 30564 1726882831.32196: done dumping result, returning 30564 1726882831.32198: done running TaskExecutor() for managed_node2/TASK: Setup [0e448fcc-3ce9-4216-acec-000000000a4c] 30564 1726882831.32200: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4c 30564 1726882831.32238: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4c 30564 1726882831.32241: WORKER PROCESS EXITING 30564 1726882831.32266: no more pending results, returning what we have 30564 1726882831.32272: in VariableManager get_vars() 30564 1726882831.32301: Calling all_inventory to load vars for managed_node2 30564 1726882831.32303: Calling groups_inventory to load vars for managed_node2 30564 1726882831.32306: Calling all_plugins_inventory to 
load vars for managed_node2 30564 1726882831.32314: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.32316: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.32319: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.33180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.34127: done with get_vars() 30564 1726882831.34140: variable 'ansible_search_path' from source: unknown 30564 1726882831.34140: variable 'ansible_search_path' from source: unknown 30564 1726882831.34167: we have included files to process 30564 1726882831.34168: generating all_blocks data 30564 1726882831.34169: done generating all_blocks data 30564 1726882831.34172: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882831.34173: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882831.34174: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882831.34323: done processing included file 30564 1726882831.34324: iterating over new_blocks loaded from include file 30564 1726882831.34325: in VariableManager get_vars() 30564 1726882831.34334: done with get_vars() 30564 1726882831.34335: filtering new block on tags 30564 1726882831.34354: done filtering new block on tags 30564 1726882831.34355: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30564 1726882831.34358: extending task lists for all hosts with included blocks 30564 1726882831.34722: done 
extending task lists 30564 1726882831.34723: done processing included files 30564 1726882831.34724: results queue empty 30564 1726882831.34726: checking for any_errors_fatal 30564 1726882831.34729: done checking for any_errors_fatal 30564 1726882831.34730: checking for max_fail_percentage 30564 1726882831.34731: done checking for max_fail_percentage 30564 1726882831.34731: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.34732: done checking to see if all hosts have failed 30564 1726882831.34733: getting the remaining hosts for this loop 30564 1726882831.34734: done getting the remaining hosts for this loop 30564 1726882831.34737: getting the next task for host managed_node2 30564 1726882831.34741: done getting next task for host managed_node2 30564 1726882831.34742: ^ task is: TASK: Include network role 30564 1726882831.34745: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882831.34747: getting variables 30564 1726882831.34748: in VariableManager get_vars() 30564 1726882831.34756: Calling all_inventory to load vars for managed_node2 30564 1726882831.34758: Calling groups_inventory to load vars for managed_node2 30564 1726882831.34760: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.34766: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.34768: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.34771: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.36037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.41445: done with get_vars() 30564 1726882831.41469: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:40:31 -0400 (0:00:00.105) 0:00:29.996 ****** 30564 1726882831.41541: entering _queue_task() for managed_node2/include_role 30564 1726882831.41841: worker is 1 (out of 1 available) 30564 1726882831.41854: exiting _queue_task() for managed_node2/include_role 30564 1726882831.41869: done queuing things up, now waiting for results queue to drain 30564 1726882831.41870: waiting for pending results... 
30564 1726882831.42157: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882831.42257: in run() - task 0e448fcc-3ce9-4216-acec-000000000ad1 30564 1726882831.42275: variable 'ansible_search_path' from source: unknown 30564 1726882831.42279: variable 'ansible_search_path' from source: unknown 30564 1726882831.42310: calling self._execute() 30564 1726882831.42403: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.42408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.42420: variable 'omit' from source: magic vars 30564 1726882831.42795: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.42808: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.42814: _execute() done 30564 1726882831.42818: dumping result to json 30564 1726882831.42822: done dumping result, returning 30564 1726882831.42826: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-000000000ad1] 30564 1726882831.42833: sending task result for task 0e448fcc-3ce9-4216-acec-000000000ad1 30564 1726882831.42949: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000ad1 30564 1726882831.42953: WORKER PROCESS EXITING 30564 1726882831.42992: no more pending results, returning what we have 30564 1726882831.42998: in VariableManager get_vars() 30564 1726882831.43035: Calling all_inventory to load vars for managed_node2 30564 1726882831.43038: Calling groups_inventory to load vars for managed_node2 30564 1726882831.43042: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.43056: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.43059: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.43062: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.44617: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.46207: done with get_vars() 30564 1726882831.46298: variable 'ansible_search_path' from source: unknown 30564 1726882831.46300: variable 'ansible_search_path' from source: unknown 30564 1726882831.46499: variable 'omit' from source: magic vars 30564 1726882831.46542: variable 'omit' from source: magic vars 30564 1726882831.46556: variable 'omit' from source: magic vars 30564 1726882831.46560: we have included files to process 30564 1726882831.46561: generating all_blocks data 30564 1726882831.46562: done generating all_blocks data 30564 1726882831.46567: processing included file: fedora.linux_system_roles.network 30564 1726882831.46589: in VariableManager get_vars() 30564 1726882831.46601: done with get_vars() 30564 1726882831.46628: in VariableManager get_vars() 30564 1726882831.46646: done with get_vars() 30564 1726882831.46691: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882831.46811: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882831.47082: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882831.47538: in VariableManager get_vars() 30564 1726882831.47555: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882831.50635: iterating over new_blocks loaded from include file 30564 1726882831.50637: in VariableManager get_vars() 30564 1726882831.50653: done with get_vars() 30564 1726882831.50655: filtering new block on tags 30564 1726882831.51263: done filtering new block on tags 30564 1726882831.51269: in VariableManager get_vars() 30564 1726882831.51282: done with get_vars() 30564 1726882831.51284: filtering new block on tags 30564 1726882831.51299: done 
filtering new block on tags 30564 1726882831.51300: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882831.51305: extending task lists for all hosts with included blocks 30564 1726882831.51472: done extending task lists 30564 1726882831.51473: done processing included files 30564 1726882831.51474: results queue empty 30564 1726882831.51475: checking for any_errors_fatal 30564 1726882831.51479: done checking for any_errors_fatal 30564 1726882831.51479: checking for max_fail_percentage 30564 1726882831.51480: done checking for max_fail_percentage 30564 1726882831.51481: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.51482: done checking to see if all hosts have failed 30564 1726882831.51483: getting the remaining hosts for this loop 30564 1726882831.51484: done getting the remaining hosts for this loop 30564 1726882831.51486: getting the next task for host managed_node2 30564 1726882831.51491: done getting next task for host managed_node2 30564 1726882831.51493: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882831.51497: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882831.51505: getting variables 30564 1726882831.51506: in VariableManager get_vars() 30564 1726882831.51517: Calling all_inventory to load vars for managed_node2 30564 1726882831.51519: Calling groups_inventory to load vars for managed_node2 30564 1726882831.51521: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.51526: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.51528: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.51530: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.54312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.56321: done with get_vars() 30564 1726882831.56338: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:40:31 -0400 (0:00:00.148) 0:00:30.145 ****** 30564 1726882831.56396: entering _queue_task() for managed_node2/include_tasks 30564 1726882831.56621: worker is 1 (out of 1 available) 30564 1726882831.56636: exiting _queue_task() for managed_node2/include_tasks 30564 1726882831.56647: done queuing things up, now waiting for results queue to drain 30564 1726882831.56648: waiting for pending results... 
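Every task queued in this run logs `Evaluated conditional (ansible_distribution_major_version != '6'): True` before executing. This is the standard EL6-exclusion guard the test playbooks apply per task. A minimal Python sketch of that check, using a hypothetical fact value (the real value comes from gathered facts on the managed node):

```python
# Sketch of the per-task guard logged as
# "Evaluated conditional (ansible_distribution_major_version != '6')".
# The fact value below is a hypothetical example, not taken from this run.
facts = {"ansible_distribution_major_version": "9"}

def evaluate_guard(facts):
    # Ansible stores the major version as a string, hence the quoted '6'
    # in the Jinja2 condition; the comparison is string inequality.
    return facts.get("ansible_distribution_major_version") != "6"

print(evaluate_guard(facts))  # True on any non-EL6 host, so the task runs
```

When the guard evaluates True the executor proceeds into `_execute()`; when False it logs "when evaluation is False, skipping this task", as later tasks in this log show.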
30564 1726882831.56837: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882831.56927: in run() - task 0e448fcc-3ce9-4216-acec-000000000b33 30564 1726882831.56938: variable 'ansible_search_path' from source: unknown 30564 1726882831.56941: variable 'ansible_search_path' from source: unknown 30564 1726882831.56972: calling self._execute() 30564 1726882831.57048: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.57054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.57062: variable 'omit' from source: magic vars 30564 1726882831.57341: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.57353: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.57358: _execute() done 30564 1726882831.57361: dumping result to json 30564 1726882831.57365: done dumping result, returning 30564 1726882831.57373: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-000000000b33] 30564 1726882831.57378: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b33 30564 1726882831.57460: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b33 30564 1726882831.57463: WORKER PROCESS EXITING 30564 1726882831.57512: no more pending results, returning what we have 30564 1726882831.57517: in VariableManager get_vars() 30564 1726882831.57556: Calling all_inventory to load vars for managed_node2 30564 1726882831.57559: Calling groups_inventory to load vars for managed_node2 30564 1726882831.57561: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.57581: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.57584: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.57587: Calling 
groups_plugins_play to load vars for managed_node2 30564 1726882831.59052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.60025: done with get_vars() 30564 1726882831.60039: variable 'ansible_search_path' from source: unknown 30564 1726882831.60040: variable 'ansible_search_path' from source: unknown 30564 1726882831.60067: we have included files to process 30564 1726882831.60068: generating all_blocks data 30564 1726882831.60069: done generating all_blocks data 30564 1726882831.60071: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882831.60072: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882831.60074: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882831.60439: done processing included file 30564 1726882831.60440: iterating over new_blocks loaded from include file 30564 1726882831.60441: in VariableManager get_vars() 30564 1726882831.60455: done with get_vars() 30564 1726882831.60456: filtering new block on tags 30564 1726882831.60477: done filtering new block on tags 30564 1726882831.60479: in VariableManager get_vars() 30564 1726882831.60492: done with get_vars() 30564 1726882831.60493: filtering new block on tags 30564 1726882831.60521: done filtering new block on tags 30564 1726882831.60524: in VariableManager get_vars() 30564 1726882831.60537: done with get_vars() 30564 1726882831.60538: filtering new block on tags 30564 1726882831.60562: done filtering new block on tags 30564 1726882831.60564: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882831.60569: extending task lists for 
all hosts with included blocks 30564 1726882831.62259: done extending task lists 30564 1726882831.62261: done processing included files 30564 1726882831.62261: results queue empty 30564 1726882831.62262: checking for any_errors_fatal 30564 1726882831.62266: done checking for any_errors_fatal 30564 1726882831.62269: checking for max_fail_percentage 30564 1726882831.62271: done checking for max_fail_percentage 30564 1726882831.62271: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.62272: done checking to see if all hosts have failed 30564 1726882831.62273: getting the remaining hosts for this loop 30564 1726882831.62274: done getting the remaining hosts for this loop 30564 1726882831.62281: getting the next task for host managed_node2 30564 1726882831.62286: done getting next task for host managed_node2 30564 1726882831.62289: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882831.62293: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882831.62303: getting variables 30564 1726882831.62304: in VariableManager get_vars() 30564 1726882831.62316: Calling all_inventory to load vars for managed_node2 30564 1726882831.62318: Calling groups_inventory to load vars for managed_node2 30564 1726882831.62320: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.62325: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.62327: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.62330: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.63302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.64838: done with get_vars() 30564 1726882831.64856: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:40:31 -0400 (0:00:00.085) 0:00:30.230 ****** 30564 1726882831.64910: entering _queue_task() for managed_node2/setup 30564 1726882831.65218: worker is 1 (out of 1 available) 30564 1726882831.65231: exiting _queue_task() for managed_node2/setup 30564 1726882831.65247: done queuing things up, now waiting for results queue to drain 30564 1726882831.65248: waiting for pending results... 
30564 1726882831.65684: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882831.65813: in run() - task 0e448fcc-3ce9-4216-acec-000000000b90 30564 1726882831.65818: variable 'ansible_search_path' from source: unknown 30564 1726882831.65821: variable 'ansible_search_path' from source: unknown 30564 1726882831.65824: calling self._execute() 30564 1726882831.65873: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.65878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.65888: variable 'omit' from source: magic vars 30564 1726882831.66283: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.66287: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.66584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882831.69662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882831.69759: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882831.69800: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882831.69838: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882831.69875: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882831.69986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882831.70016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882831.70048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882831.70117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882831.70129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882831.70193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882831.70219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882831.70244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882831.70317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882831.70333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882831.70626: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882831.70629: variable 'ansible_facts' from source: unknown 30564 1726882831.71604: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882831.71608: when evaluation is False, skipping this task 30564 1726882831.71611: _execute() done 30564 1726882831.71613: dumping result to json 30564 1726882831.71615: done dumping result, returning 30564 1726882831.71622: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000000b90] 30564 1726882831.71628: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b90 30564 1726882831.71742: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b90 30564 1726882831.71747: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882831.71798: no more pending results, returning what we have 30564 1726882831.71802: results queue empty 30564 1726882831.71803: checking for any_errors_fatal 30564 1726882831.71805: done checking for any_errors_fatal 30564 1726882831.71806: checking for max_fail_percentage 30564 1726882831.71808: done checking for max_fail_percentage 30564 1726882831.71809: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.71810: done checking to see if all hosts have failed 30564 1726882831.71811: getting the remaining hosts for this loop 30564 1726882831.71812: done getting the remaining hosts for this loop 30564 1726882831.71817: getting the next task for host managed_node2 30564 1726882831.71828: done getting next task for host managed_node2 30564 1726882831.71832: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882831.71838: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882831.71878: getting variables 30564 1726882831.71881: in VariableManager get_vars() 30564 1726882831.71918: Calling all_inventory to load vars for managed_node2 30564 1726882831.71921: Calling groups_inventory to load vars for managed_node2 30564 1726882831.71924: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882831.71935: Calling all_plugins_play to load vars for managed_node2 30564 1726882831.71938: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882831.71947: Calling groups_plugins_play to load vars for managed_node2 30564 1726882831.73669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882831.74650: done with get_vars() 30564 1726882831.74668: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:40:31 -0400 (0:00:00.098) 0:00:30.328 ****** 30564 1726882831.74762: entering _queue_task() for managed_node2/stat 30564 1726882831.75036: worker is 1 (out of 1 available) 30564 1726882831.75050: exiting _queue_task() for managed_node2/stat 30564 1726882831.75062: done queuing things up, now waiting for results queue to drain 30564 1726882831.75065: waiting for pending results... 
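The "Ensure ansible_facts used by role are present" task above was skipped because its conditional `(__network_required_facts | difference(ansible_facts.keys() | list) | length > 0)` evaluated False: every fact the role needs was already gathered, so re-running setup would be wasted work. The Jinja2 `difference` filter is a set difference; a sketch of the same logic in Python, with hypothetical fact names standing in for the role's real `__network_required_facts` list:

```python
# Sketch of the skip condition logged above:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# The fact names here are illustrative placeholders, not the role's actual list.
required_facts = {"distribution", "os_family"}
ansible_facts = {"distribution": "RedHat", "os_family": "RedHat", "kernel": "5.14"}

# Facts the role needs but the host has not yet gathered.
missing = required_facts.difference(ansible_facts.keys())

# The setup task only runs when at least one required fact is missing.
run_setup = len(missing) > 0
print(run_setup)  # False -> task skipped, matching "when evaluation is False"
```

This pattern lets the role run its fact-gathering step only once per host, no matter how many times the role is included in a play.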
30564 1726882831.75399: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882831.75571: in run() - task 0e448fcc-3ce9-4216-acec-000000000b92 30564 1726882831.75597: variable 'ansible_search_path' from source: unknown 30564 1726882831.75604: variable 'ansible_search_path' from source: unknown 30564 1726882831.75646: calling self._execute() 30564 1726882831.75828: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.75840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.75855: variable 'omit' from source: magic vars 30564 1726882831.76287: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.76300: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.76501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882831.76828: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882831.76832: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882831.76858: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882831.76894: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882831.76987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882831.77017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882831.77042: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882831.77073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882831.77161: variable '__network_is_ostree' from source: set_fact 30564 1726882831.77165: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882831.77180: when evaluation is False, skipping this task 30564 1726882831.77183: _execute() done 30564 1726882831.77186: dumping result to json 30564 1726882831.77188: done dumping result, returning 30564 1726882831.77191: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000000b92] 30564 1726882831.77193: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b92 30564 1726882831.77291: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b92 30564 1726882831.77294: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882831.77352: no more pending results, returning what we have 30564 1726882831.77356: results queue empty 30564 1726882831.77358: checking for any_errors_fatal 30564 1726882831.77369: done checking for any_errors_fatal 30564 1726882831.77370: checking for max_fail_percentage 30564 1726882831.77372: done checking for max_fail_percentage 30564 1726882831.77373: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.77374: done checking to see if all hosts have failed 30564 1726882831.77375: getting the remaining hosts for this loop 30564 1726882831.77377: done getting the remaining hosts for this loop 30564 
1726882831.77381: getting the next task for host managed_node2 30564 1726882831.77390: done getting next task for host managed_node2 30564 1726882831.77394: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882831.77401: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30564 1726882831.77421: getting variables
30564 1726882831.77423: in VariableManager get_vars()
30564 1726882831.77459: Calling all_inventory to load vars for managed_node2
30564 1726882831.77462: Calling groups_inventory to load vars for managed_node2
30564 1726882831.77466: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882831.77478: Calling all_plugins_play to load vars for managed_node2
30564 1726882831.77481: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882831.77484: Calling groups_plugins_play to load vars for managed_node2
30564 1726882831.79605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882831.82000: done with get_vars()
30564 1726882831.82021: done getting variables
30564 1726882831.82103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:40:31 -0400 (0:00:00.073) 0:00:30.402 ******
30564 1726882831.82138: entering _queue_task() for managed_node2/set_fact
30564 1726882831.82416: worker is 1 (out of 1 available)
30564 1726882831.82428: exiting _queue_task() for managed_node2/set_fact
30564 1726882831.82441: done queuing things up, now waiting for results queue to drain
30564 1726882831.82442: waiting for pending results...
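Both ostree-related tasks in this log are skipped because `__network_is_ostree` was already set by an earlier `set_fact`, so the guard `not __network_is_ostree is defined` evaluates False. When the guard does fire, roles of this kind commonly detect an OSTree-based system by checking for the `/run/ostree-booted` marker file. A minimal local sketch of that check (the marker path and helper name are assumptions for illustration; the log only shows the task being skipped, not the detection itself):

```python
from pathlib import Path

def is_ostree(root: str = "/") -> bool:
    """Heuristic OSTree detection: rpm-ostree based systems create the
    /run/ostree-booted marker at boot. `root` is parameterized here only
    so the sketch is testable against a scratch directory."""
    return (Path(root) / "run" / "ostree-booted").exists()
```

A role would typically run this check once, cache the result in a fact such as `__network_is_ostree`, and let the `is defined` guard skip the check on later includes, which is exactly the skip pattern visible above.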
30564 1726882831.82741: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882831.82884: in run() - task 0e448fcc-3ce9-4216-acec-000000000b93 30564 1726882831.82903: variable 'ansible_search_path' from source: unknown 30564 1726882831.82907: variable 'ansible_search_path' from source: unknown 30564 1726882831.82941: calling self._execute() 30564 1726882831.83038: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.83042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.83053: variable 'omit' from source: magic vars 30564 1726882831.83422: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.83439: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.83609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882831.83916: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882831.83958: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882831.84008: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882831.84036: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882831.84126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882831.84150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882831.84467: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882831.84472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882831.84474: variable '__network_is_ostree' from source: set_fact 30564 1726882831.84476: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882831.84478: when evaluation is False, skipping this task 30564 1726882831.84480: _execute() done 30564 1726882831.84482: dumping result to json 30564 1726882831.84483: done dumping result, returning 30564 1726882831.84486: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000000b93] 30564 1726882831.84488: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b93 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882831.84585: no more pending results, returning what we have 30564 1726882831.84589: results queue empty 30564 1726882831.84590: checking for any_errors_fatal 30564 1726882831.84594: done checking for any_errors_fatal 30564 1726882831.84595: checking for max_fail_percentage 30564 1726882831.84597: done checking for max_fail_percentage 30564 1726882831.84598: checking to see if all hosts have failed and the running result is not ok 30564 1726882831.84598: done checking to see if all hosts have failed 30564 1726882831.84599: getting the remaining hosts for this loop 30564 1726882831.84600: done getting the remaining hosts for this loop 30564 1726882831.84604: getting the next task for host managed_node2 30564 1726882831.84613: done getting next task for host managed_node2 30564 
1726882831.84616: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882831.84622: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30564 1726882831.84639: getting variables
30564 1726882831.84640: in VariableManager get_vars()
30564 1726882831.84671: Calling all_inventory to load vars for managed_node2
30564 1726882831.84675: Calling groups_inventory to load vars for managed_node2
30564 1726882831.84677: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882831.84688: Calling all_plugins_play to load vars for managed_node2
30564 1726882831.84691: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882831.84694: Calling groups_plugins_play to load vars for managed_node2
30564 1726882831.85627: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b93
30564 1726882831.85631: WORKER PROCESS EXITING
30564 1726882831.86356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882831.88540: done with get_vars()
30564 1726882831.88573: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:40:31 -0400 (0:00:00.068) 0:00:30.471 ******
30564 1726882831.88981: entering _queue_task() for managed_node2/service_facts
30564 1726882831.89276: worker is 1 (out of 1 available)
30564 1726882831.89288: exiting _queue_task() for managed_node2/service_facts
30564 1726882831.89316: done queuing things up, now waiting for results queue to drain
30564 1726882831.89318: waiting for pending results...
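The task queued here runs the `ansible.builtin.service_facts` module, whose result is the large `{"ansible_facts": {"services": {...}}}` JSON that appears later in this log, one entry per unit with `name`, `state`, `status`, and `source` keys. A simplified sketch of how such a mapping can be built from `systemctl list-units --type=service`-style output (the sample data and parsing below are illustrative assumptions; the real module also merges `systemctl list-unit-files` data to fill in the `status` field and supports non-systemd init systems):

```python
def parse_service_units(listing: str) -> dict:
    """Build a service_facts-style mapping from `systemctl list-units`-like
    text. Columns assumed: UNIT LOAD ACTIVE SUB DESCRIPTION..."""
    services = {}
    for line in listing.strip().splitlines():
        fields = line.split()
        if len(fields) < 4 or not fields[0].endswith(".service"):
            continue  # skip headers, non-service units, blank lines
        name, _load, _active, sub = fields[:4]
        services[name] = {
            "name": name,
            # service_facts reports "running" only for SUB=running;
            # everything else (exited, dead, failed, ...) shows as "stopped"
            "state": "running" if sub == "running" else "stopped",
            # the real module derives this from `systemctl list-unit-files`
            "status": "unknown",
            "source": "systemd",
        }
    return services

SAMPLE = """\
auditd.service  loaded active running Security Auditing Service
chronyd.service loaded active running NTP client/server
kdump.service   loaded active exited  Crash recovery kernel arming
"""
```

This matches the shape of the log's own result, e.g. `"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}`.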
30564 1726882831.89748: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882831.89927: in run() - task 0e448fcc-3ce9-4216-acec-000000000b95 30564 1726882831.89948: variable 'ansible_search_path' from source: unknown 30564 1726882831.89955: variable 'ansible_search_path' from source: unknown 30564 1726882831.90007: calling self._execute() 30564 1726882831.90119: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.90131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.90145: variable 'omit' from source: magic vars 30564 1726882831.91218: variable 'ansible_distribution_major_version' from source: facts 30564 1726882831.91237: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882831.91248: variable 'omit' from source: magic vars 30564 1726882831.91446: variable 'omit' from source: magic vars 30564 1726882831.91489: variable 'omit' from source: magic vars 30564 1726882831.91605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882831.91715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882831.91822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882831.91873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882831.91902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882831.91955: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882831.91972: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.91982: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882831.92123: Set connection var ansible_timeout to 10 30564 1726882831.92134: Set connection var ansible_pipelining to False 30564 1726882831.92141: Set connection var ansible_shell_type to sh 30564 1726882831.92153: Set connection var ansible_shell_executable to /bin/sh 30564 1726882831.92177: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882831.92187: Set connection var ansible_connection to ssh 30564 1726882831.92215: variable 'ansible_shell_executable' from source: unknown 30564 1726882831.92223: variable 'ansible_connection' from source: unknown 30564 1726882831.92231: variable 'ansible_module_compression' from source: unknown 30564 1726882831.92241: variable 'ansible_shell_type' from source: unknown 30564 1726882831.92249: variable 'ansible_shell_executable' from source: unknown 30564 1726882831.92255: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882831.92263: variable 'ansible_pipelining' from source: unknown 30564 1726882831.92282: variable 'ansible_timeout' from source: unknown 30564 1726882831.92295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882831.92605: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882831.92623: variable 'omit' from source: magic vars 30564 1726882831.92636: starting attempt loop 30564 1726882831.92650: running the handler 30564 1726882831.92699: _low_level_execute_command(): starting 30564 1726882831.92711: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882831.95005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882831.95018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882831.95029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882831.95043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882831.95082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882831.95779: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882831.95788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882831.95801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882831.95810: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882831.95816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882831.95824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882831.95834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882831.95845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882831.95853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882831.95860: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882831.95873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882831.95946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882831.95960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882831.95970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882831.96108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
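The `_low_level_execute_command()` probe whose SSH stderr is shown above (`/bin/sh -c 'echo ~ && sleep 0'`) discovers the remote home directory; the follow-up command then creates a per-task working directory named `ansible-tmp-<epoch>-<pid>-<random>` under `~/.ansible/tmp` with `umask 77`. A local Python sketch of that naming and permission pattern (an illustration of what the log's shell pipeline does, not Ansible's actual implementation):

```python
import os
import random
import time
from pathlib import Path

def make_ansible_tmp(base: str) -> Path:
    """Create a working directory following the ansible-tmp-<epoch>-<pid>-<rand>
    naming visible in this log; mode 0o700 mirrors the `umask 77` in the
    remote `mkdir` pipeline. The random-suffix range is an assumption."""
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))
    root = Path(base) / ".ansible" / "tmp"
    root.mkdir(parents=True, exist_ok=True)   # the `mkdir -p` step
    tmp = root / name
    tmp.mkdir(mode=0o700)                     # fails if it already exists
    return tmp
```

The unique name lets concurrent tasks on the same host coexist, and the plain `mkdir` (no `-p`) for the leaf directory makes a name collision an error rather than a silent reuse.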
30564 1726882831.97758: stdout chunk (state=3): >>>/root <<< 30564 1726882831.97930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882831.97933: stdout chunk (state=3): >>><<< 30564 1726882831.97944: stderr chunk (state=3): >>><<< 30564 1726882831.97964: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882831.97981: _low_level_execute_command(): starting 30564 1726882831.97988: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170 `" && echo ansible-tmp-1726882831.97965-31964-156060553384170="` echo /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170 `" ) && sleep 0' 30564 1726882831.99199: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882832.00071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.00075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882832.00078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882832.00080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882832.00082: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882832.00085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.00087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882832.00089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882832.00091: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882832.00093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.00095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882832.00097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882832.00099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882832.00101: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882832.00103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.00105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882832.00123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882832.00131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882832.00397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882832.02289: stdout chunk (state=3): >>>ansible-tmp-1726882831.97965-31964-156060553384170=/root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170 <<< 30564 1726882832.02470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882832.02474: stdout chunk (state=3): >>><<< 30564 1726882832.02479: stderr chunk (state=3): >>><<< 30564 1726882832.02496: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882831.97965-31964-156060553384170=/root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882832.02543: variable 'ansible_module_compression' from source: unknown 30564 1726882832.02589: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882832.02628: variable 'ansible_facts' from source: unknown 30564 1726882832.02716: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170/AnsiballZ_service_facts.py 30564 1726882832.03322: Sending initial data 30564 1726882832.03326: Sent initial data (160 bytes) 30564 1726882832.05370: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882832.05781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.05791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882832.05805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882832.05843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882832.05850: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882832.05860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.05878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882832.05884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882832.05890: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882832.05899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.05910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882832.05919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882832.05926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882832.05933: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882832.05942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.06015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882832.06087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882832.06114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882832.06396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882832.08184: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882832.08191: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882832.08285: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882832.08389: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp6xwlopjb /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170/AnsiballZ_service_facts.py <<< 30564 1726882832.08492: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882832.10037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882832.10045: stderr chunk (state=3): >>><<< 30564 
1726882832.10048: stdout chunk (state=3): >>><<< 30564 1726882832.10071: done transferring module to remote 30564 1726882832.10080: _low_level_execute_command(): starting 30564 1726882832.10085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170/ /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170/AnsiballZ_service_facts.py && sleep 0' 30564 1726882832.10918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.10932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882832.10938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882832.10977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882832.10983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.10998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.11003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882832.11016: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882832.11020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.11107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882832.11125: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882832.11259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882832.13071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882832.13079: stderr chunk (state=3): >>><<< 30564 1726882832.13085: stdout chunk (state=3): >>><<< 30564 1726882832.13100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882832.13103: _low_level_execute_command(): starting 30564 1726882832.13106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170/AnsiballZ_service_facts.py && sleep 0' 30564 1726882832.14890: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 
1726882832.14983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.15000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882832.15028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882832.15073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882832.15131: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882832.15146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.15187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882832.15207: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882832.15235: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882832.15249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882832.15278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882832.15296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882832.15365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882832.15387: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882832.15402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882832.15596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882832.15613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882832.15628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882832.15809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882833.54763: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 30564 1726882833.54812: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, 
"snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882833.56181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882833.56185: stdout chunk (state=3): >>><<< 30564 1726882833.56187: stderr chunk (state=3): >>><<< 30564 1726882833.56278: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": 
"teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
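The `service_facts` payload that ends above is a flat dict keyed by unit name, each value carrying `name`, `state`, `status`, and `source`. A minimal sketch of filtering such a dict for running units — the sample entries are a hypothetical subset of the real output, not the full result:

```python
# Sketch: filter an ansible_facts.services-style dict (the shape returned by
# the service_facts module in the log above) for units in state "running".
# These three entries are a hypothetical subset copied from the logged output.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "systemd-journald.service": {"name": "systemd-journald.service",
                                 "state": "running", "status": "static",
                                 "source": "systemd"},
}

# Sort for deterministic output; the module itself returns an unordered dict.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)
```

In a playbook this same filter is usually written with a Jinja2 expression over `ansible_facts.services`, but the dict shape is identical.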
30564 1726882833.57087: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882833.57105: _low_level_execute_command(): starting 30564 1726882833.57119: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882831.97965-31964-156060553384170/ > /dev/null 2>&1 && sleep 0' 30564 1726882833.57883: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882833.57916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882833.57938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.57984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.58078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.58091: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882833.58109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.58140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882833.58173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
<<< 30564 1726882833.58186: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.58236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882833.58245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882833.58378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882833.60291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882833.60295: stdout chunk (state=3): >>><<< 30564 1726882833.60298: stderr chunk (state=3): >>><<< 30564 1726882833.60577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882833.60580: handler run complete 30564 1726882833.60582: variable 'ansible_facts' from source: unknown 30564 1726882833.60673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882833.61199: variable 'ansible_facts' from source: unknown 30564 1726882833.61350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882833.61566: attempt loop complete, returning result 30564 1726882833.61581: _execute() done 30564 1726882833.61588: dumping result to json 30564 1726882833.61648: done dumping result, returning 30564 1726882833.61678: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000000b95] 30564 1726882833.61690: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b95 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882833.62482: no more pending results, returning what we have 30564 1726882833.62486: results queue empty 30564 1726882833.62487: checking for any_errors_fatal 30564 1726882833.62491: done checking for any_errors_fatal 30564 1726882833.62492: checking for max_fail_percentage 30564 1726882833.62494: done checking for max_fail_percentage 30564 1726882833.62494: checking to see if all hosts have failed and the running result is not ok 30564 1726882833.62495: done checking to see if all hosts have failed 30564 1726882833.62496: getting the remaining hosts for this loop 30564 1726882833.62497: done getting the remaining hosts for this 
loop 30564 1726882833.62500: getting the next task for host managed_node2 30564 1726882833.62507: done getting next task for host managed_node2 30564 1726882833.62510: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882833.62515: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882833.62526: getting variables 30564 1726882833.62527: in VariableManager get_vars() 30564 1726882833.62555: Calling all_inventory to load vars for managed_node2 30564 1726882833.62557: Calling groups_inventory to load vars for managed_node2 30564 1726882833.62560: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882833.62579: Calling all_plugins_play to load vars for managed_node2 30564 1726882833.62582: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882833.62590: Calling groups_plugins_play to load vars for managed_node2 30564 1726882833.63212: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b95 30564 1726882833.63215: WORKER PROCESS EXITING 30564 1726882833.63766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882833.65394: done with get_vars() 30564 1726882833.65411: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:40:33 -0400 (0:00:01.765) 0:00:32.236 ****** 30564 1726882833.65484: entering _queue_task() for managed_node2/package_facts 30564 1726882833.65700: worker is 1 (out of 1 available) 30564 1726882833.65714: exiting _queue_task() for managed_node2/package_facts 30564 1726882833.65727: done queuing things up, now waiting for results queue to drain 30564 1726882833.65729: waiting for pending results... 
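The task banner above carries a profiling line, `Friday 20 September 2024 21:40:33 -0400 (0:00:01.765) 0:00:32.236`: the parenthesized value is the previous task's duration and the second value is the cumulative playbook runtime. A sketch of parsing that line into seconds — the regex is an assumption about this callback's format, not an Ansible API:

```python
import re

# Sketch: split a profile-style timing line (as printed in the task banner
# above) into per-task duration and cumulative runtime, both in seconds.
line = "Friday 20 September 2024 21:40:33 -0400 (0:00:01.765)       0:00:32.236"

# Hypothetical pattern for "(H:MM:SS.mmm) H:MM:SS.mmm" as seen in the log.
m = re.search(r"\((\d+):(\d+):([\d.]+)\)\s+(\d+):(\d+):([\d.]+)", line)
h, mi, s, ch, cmi, cs = m.groups()

task_secs = int(h) * 3600 + int(mi) * 60 + float(s)     # previous task
total_secs = int(ch) * 3600 + int(cmi) * 60 + float(cs)  # running total
print(task_secs, total_secs)
```

Here `task_secs` comes out to 1.765 and `total_secs` to 32.236, matching the values the callback printed for this task.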
30564 1726882833.65907: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882833.66016: in run() - task 0e448fcc-3ce9-4216-acec-000000000b96 30564 1726882833.66028: variable 'ansible_search_path' from source: unknown 30564 1726882833.66032: variable 'ansible_search_path' from source: unknown 30564 1726882833.66061: calling self._execute() 30564 1726882833.66131: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882833.66135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882833.66147: variable 'omit' from source: magic vars 30564 1726882833.66416: variable 'ansible_distribution_major_version' from source: facts 30564 1726882833.66426: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882833.66432: variable 'omit' from source: magic vars 30564 1726882833.66484: variable 'omit' from source: magic vars 30564 1726882833.66505: variable 'omit' from source: magic vars 30564 1726882833.66537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882833.66562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882833.66581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882833.66595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882833.66606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882833.66629: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882833.66632: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882833.66635: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882833.66708: Set connection var ansible_timeout to 10 30564 1726882833.66711: Set connection var ansible_pipelining to False 30564 1726882833.66715: Set connection var ansible_shell_type to sh 30564 1726882833.66720: Set connection var ansible_shell_executable to /bin/sh 30564 1726882833.66727: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882833.66730: Set connection var ansible_connection to ssh 30564 1726882833.66747: variable 'ansible_shell_executable' from source: unknown 30564 1726882833.66750: variable 'ansible_connection' from source: unknown 30564 1726882833.66753: variable 'ansible_module_compression' from source: unknown 30564 1726882833.66758: variable 'ansible_shell_type' from source: unknown 30564 1726882833.66760: variable 'ansible_shell_executable' from source: unknown 30564 1726882833.66762: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882833.66778: variable 'ansible_pipelining' from source: unknown 30564 1726882833.66792: variable 'ansible_timeout' from source: unknown 30564 1726882833.66805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882833.66976: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882833.66998: variable 'omit' from source: magic vars 30564 1726882833.67009: starting attempt loop 30564 1726882833.67018: running the handler 30564 1726882833.67033: _low_level_execute_command(): starting 30564 1726882833.67039: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882833.67915: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882833.68031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882833.69674: stdout chunk (state=3): >>>/root <<< 30564 1726882833.69784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882833.69827: stderr chunk (state=3): >>><<< 30564 1726882833.69829: stdout chunk (state=3): >>><<< 30564 1726882833.69868: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882833.69872: _low_level_execute_command(): starting 30564 1726882833.69875: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384 `" && echo ansible-tmp-1726882833.6984081-32054-78947761185384="` echo /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384 `" ) && sleep 0' 30564 1726882833.70268: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.70276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.70304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.70310: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.70322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882833.70329: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.70338: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.70347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.70352: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.70407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882833.70433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882833.70436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882833.70534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882833.72416: stdout chunk (state=3): >>>ansible-tmp-1726882833.6984081-32054-78947761185384=/root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384 <<< 30564 1726882833.72601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882833.72604: stdout chunk (state=3): >>><<< 30564 1726882833.72606: stderr chunk (state=3): >>><<< 30564 1726882833.72672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882833.6984081-32054-78947761185384=/root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882833.72676: variable 'ansible_module_compression' from source: unknown 30564 1726882833.72873: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882833.72877: variable 'ansible_facts' from source: unknown 30564 1726882833.72976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384/AnsiballZ_package_facts.py 30564 1726882833.73133: Sending initial data 30564 1726882833.73136: Sent initial data (161 bytes) 30564 1726882833.74083: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882833.74098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882833.74113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.74132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.74177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.74191: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882833.74203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.74218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882833.74229: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882833.74240: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882833.74253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882833.74273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.74300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.74311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.74356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882833.74370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882833.74491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882833.76290: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882833.76389: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882833.76490: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-30564uwjv555r/tmpnlsz8_fl /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384/AnsiballZ_package_facts.py <<< 30564 1726882833.76590: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882833.78627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882833.78771: stderr chunk (state=3): >>><<< 30564 1726882833.78778: stdout chunk (state=3): >>><<< 30564 1726882833.78781: done transferring module to remote 30564 1726882833.78783: _low_level_execute_command(): starting 30564 1726882833.78786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384/ /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384/AnsiballZ_package_facts.py && sleep 0' 30564 1726882833.79283: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882833.79292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882833.79301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.79315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.79350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.79357: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882833.79367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.79384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882833.79391: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882833.79400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882833.79405: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882833.79412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.79423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.79430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.79436: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882833.79445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.79520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882833.79535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882833.79547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882833.79668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882833.81447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882833.81531: stderr chunk (state=3): >>><<< 30564 1726882833.81536: stdout chunk (state=3): >>><<< 30564 1726882833.81554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882833.81557: _low_level_execute_command(): starting 30564 1726882833.81561: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384/AnsiballZ_package_facts.py && sleep 0' 30564 1726882833.82220: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882833.82228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882833.82239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.82252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.82294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.82309: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882833.82319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.82332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882833.82339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882833.82345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882833.82352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882833.82361: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882833.82377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882833.82384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882833.82390: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882833.82399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882833.82472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882833.82490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882833.82502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882833.82645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882834.28844: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", 
"version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": 
"36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 30564 1726882834.28905: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": 
[{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", 
"version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": 
"libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": 
"libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 30564 1726882834.28925: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": 
"11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": 
[{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", 
"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 30564 1726882834.28958: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", 
"version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 30564 1726882834.28997: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 30564 1726882834.29018: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", 
"version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": 
"10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 30564 1726882834.29039: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": 
"4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 30564 1726882834.29060: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": 
"5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 30564 1726882834.29067: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": 
"19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882834.30515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
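The module result that closes above is the output of Ansible's `package_facts` module (invoked, per the trailing `invocation` block, with `manager: ["auto"]` and `strategy: "first"`). Each key in `ansible_facts.packages` maps a package name to a *list* of installed instances, since one name can appear more than once (multilib packages, or the multiple imported `gpg-pubkey` entries visible in the log). A minimal sketch of navigating that structure in Python, using values copied from the log above:

```python
import json

# A fragment of the package_facts result, copied verbatim from the log above.
result = json.loads("""
{"ansible_facts": {"packages": {
  "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9",
            "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "gpg-pubkey": [
    {"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb",
     "epoch": null, "arch": null, "source": "rpm"},
    {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19",
     "epoch": null, "arch": null, "source": "rpm"}]}}}
""")

packages = result["ansible_facts"]["packages"]

# Each value is a list: index [0] picks the first (usually only) instance.
bash = packages["bash"][0]
nvra = f'{bash["name"]}-{bash["version"]}-{bash["release"]}.{bash["arch"]}'
print(nvra)                          # bash-5.1.8-9.el9.x86_64

# Names can map to several instances, e.g. the two imported GPG keys.
print(len(packages["gpg-pubkey"]))   # 2
```

In a playbook the same lookups are typically written as Jinja2 expressions, e.g. `ansible_facts.packages['bash'][0].version`.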
<<< 30564 1726882834.30567: stderr chunk (state=3): >>><<< 30564 1726882834.30573: stdout chunk (state=3): >>><<< 30564 1726882834.30610: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882834.31994: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882834.32009: _low_level_execute_command(): starting 30564 1726882834.32015: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882833.6984081-32054-78947761185384/ > /dev/null 2>&1 && sleep 0' 30564 1726882834.32448: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882834.32453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882834.32487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882834.32500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882834.32555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882834.32567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882834.32678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882834.34529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882834.34533: stdout chunk (state=3): >>><<< 30564 1726882834.34538: stderr chunk (state=3): >>><<< 30564 1726882834.34549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882834.34554: handler run complete 30564 1726882834.35375: variable 'ansible_facts' 
from source: unknown 30564 1726882834.35516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.37351: variable 'ansible_facts' from source: unknown 30564 1726882834.37614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.38044: attempt loop complete, returning result 30564 1726882834.38054: _execute() done 30564 1726882834.38057: dumping result to json 30564 1726882834.38184: done dumping result, returning 30564 1726882834.38192: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000000b96] 30564 1726882834.38197: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b96 30564 1726882834.39510: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b96 30564 1726882834.39513: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882834.39599: no more pending results, returning what we have 30564 1726882834.39601: results queue empty 30564 1726882834.39602: checking for any_errors_fatal 30564 1726882834.39609: done checking for any_errors_fatal 30564 1726882834.39610: checking for max_fail_percentage 30564 1726882834.39611: done checking for max_fail_percentage 30564 1726882834.39611: checking to see if all hosts have failed and the running result is not ok 30564 1726882834.39612: done checking to see if all hosts have failed 30564 1726882834.39612: getting the remaining hosts for this loop 30564 1726882834.39613: done getting the remaining hosts for this loop 30564 1726882834.39616: getting the next task for host managed_node2 30564 1726882834.39621: done getting next task for host managed_node2 30564 1726882834.39623: ^ task is: TASK: fedora.linux_system_roles.network : 
Print network provider 30564 1726882834.39627: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882834.39635: getting variables 30564 1726882834.39635: in VariableManager get_vars() 30564 1726882834.39656: Calling all_inventory to load vars for managed_node2 30564 1726882834.39658: Calling groups_inventory to load vars for managed_node2 30564 1726882834.39659: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.39669: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.39671: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.39675: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.40382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.41368: done with get_vars() 30564 1726882834.41385: done getting variables 30564 1726882834.41426: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:40:34 -0400 (0:00:00.759) 0:00:32.995 ****** 30564 1726882834.41465: entering _queue_task() for managed_node2/debug 30564 1726882834.41675: worker is 1 (out of 1 available) 30564 1726882834.41690: exiting _queue_task() for managed_node2/debug 30564 1726882834.41703: done queuing things up, now waiting for results queue to drain 30564 1726882834.41704: waiting for pending results... 
30564 1726882834.41892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882834.41986: in run() - task 0e448fcc-3ce9-4216-acec-000000000b34 30564 1726882834.41997: variable 'ansible_search_path' from source: unknown 30564 1726882834.42001: variable 'ansible_search_path' from source: unknown 30564 1726882834.42033: calling self._execute() 30564 1726882834.42107: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.42113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.42123: variable 'omit' from source: magic vars 30564 1726882834.42397: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.42407: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.42413: variable 'omit' from source: magic vars 30564 1726882834.42457: variable 'omit' from source: magic vars 30564 1726882834.42524: variable 'network_provider' from source: set_fact 30564 1726882834.42538: variable 'omit' from source: magic vars 30564 1726882834.42576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882834.42602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882834.42616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882834.42630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882834.42638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882834.42661: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882834.42672: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882834.42677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.42745: Set connection var ansible_timeout to 10 30564 1726882834.42748: Set connection var ansible_pipelining to False 30564 1726882834.42751: Set connection var ansible_shell_type to sh 30564 1726882834.42757: Set connection var ansible_shell_executable to /bin/sh 30564 1726882834.42765: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882834.42771: Set connection var ansible_connection to ssh 30564 1726882834.42789: variable 'ansible_shell_executable' from source: unknown 30564 1726882834.42792: variable 'ansible_connection' from source: unknown 30564 1726882834.42797: variable 'ansible_module_compression' from source: unknown 30564 1726882834.42799: variable 'ansible_shell_type' from source: unknown 30564 1726882834.42802: variable 'ansible_shell_executable' from source: unknown 30564 1726882834.42804: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.42806: variable 'ansible_pipelining' from source: unknown 30564 1726882834.42808: variable 'ansible_timeout' from source: unknown 30564 1726882834.42810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.42913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882834.42922: variable 'omit' from source: magic vars 30564 1726882834.42928: starting attempt loop 30564 1726882834.42930: running the handler 30564 1726882834.42967: handler run complete 30564 1726882834.42978: attempt loop complete, returning result 30564 1726882834.42981: _execute() done 30564 1726882834.42985: dumping result to json 30564 1726882834.42987: done dumping result, returning 
30564 1726882834.42992: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000000b34] 30564 1726882834.43000: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b34 30564 1726882834.43082: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b34 30564 1726882834.43085: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882834.43144: no more pending results, returning what we have 30564 1726882834.43147: results queue empty 30564 1726882834.43148: checking for any_errors_fatal 30564 1726882834.43154: done checking for any_errors_fatal 30564 1726882834.43155: checking for max_fail_percentage 30564 1726882834.43157: done checking for max_fail_percentage 30564 1726882834.43158: checking to see if all hosts have failed and the running result is not ok 30564 1726882834.43158: done checking to see if all hosts have failed 30564 1726882834.43159: getting the remaining hosts for this loop 30564 1726882834.43161: done getting the remaining hosts for this loop 30564 1726882834.43166: getting the next task for host managed_node2 30564 1726882834.43173: done getting next task for host managed_node2 30564 1726882834.43177: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882834.43181: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882834.43192: getting variables 30564 1726882834.43193: in VariableManager get_vars() 30564 1726882834.43220: Calling all_inventory to load vars for managed_node2 30564 1726882834.43223: Calling groups_inventory to load vars for managed_node2 30564 1726882834.43225: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.43233: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.43235: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.43238: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.43994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.44923: done with get_vars() 30564 1726882834.44939: done getting variables 30564 1726882834.44978: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:40:34 -0400 (0:00:00.035) 0:00:33.031 ****** 30564 1726882834.45005: entering _queue_task() for managed_node2/fail 30564 1726882834.45184: worker is 1 (out of 1 available) 30564 1726882834.45197: exiting _queue_task() for managed_node2/fail 30564 1726882834.45209: done queuing things up, now waiting for results queue to drain 30564 1726882834.45210: waiting for pending results... 30564 1726882834.45377: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882834.45454: in run() - task 0e448fcc-3ce9-4216-acec-000000000b35 30564 1726882834.45466: variable 'ansible_search_path' from source: unknown 30564 1726882834.45469: variable 'ansible_search_path' from source: unknown 30564 1726882834.45499: calling self._execute() 30564 1726882834.45570: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.45577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.45585: variable 'omit' from source: magic vars 30564 1726882834.45848: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.45859: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.45944: variable 'network_state' from source: role '' defaults 30564 1726882834.45953: Evaluated conditional (network_state != {}): False 30564 1726882834.45957: when evaluation is False, skipping this task 30564 1726882834.45960: _execute() done 30564 1726882834.45963: dumping result to json 30564 1726882834.45967: done dumping result, returning 30564 1726882834.45975: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-000000000b35] 30564 1726882834.45981: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b35 30564 1726882834.46070: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b35 30564 1726882834.46074: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882834.46125: no more pending results, returning what we have 30564 1726882834.46128: results queue empty 30564 1726882834.46129: checking for any_errors_fatal 30564 1726882834.46134: done checking for any_errors_fatal 30564 1726882834.46134: checking for max_fail_percentage 30564 1726882834.46136: done checking for max_fail_percentage 30564 1726882834.46136: checking to see if all hosts have failed and the running result is not ok 30564 1726882834.46137: done checking to see if all hosts have failed 30564 1726882834.46138: getting the remaining hosts for this loop 30564 1726882834.46139: done getting the remaining hosts for this loop 30564 1726882834.46142: getting the next task for host managed_node2 30564 1726882834.46148: done getting next task for host managed_node2 30564 1726882834.46151: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882834.46155: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882834.46180: getting variables 30564 1726882834.46181: in VariableManager get_vars() 30564 1726882834.46204: Calling all_inventory to load vars for managed_node2 30564 1726882834.46206: Calling groups_inventory to load vars for managed_node2 30564 1726882834.46208: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.46214: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.46215: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.46217: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.47072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.47979: done with get_vars() 30564 1726882834.47993: done getting variables 30564 1726882834.48032: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:40:34 -0400 (0:00:00.030) 0:00:33.061 ****** 30564 1726882834.48055: entering _queue_task() for managed_node2/fail 30564 1726882834.48232: worker is 1 (out of 1 available) 30564 1726882834.48248: exiting _queue_task() for managed_node2/fail 30564 1726882834.48259: done queuing things up, now waiting for results queue to drain 30564 1726882834.48260: waiting for pending results... 30564 1726882834.48430: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882834.48520: in run() - task 0e448fcc-3ce9-4216-acec-000000000b36 30564 1726882834.48530: variable 'ansible_search_path' from source: unknown 30564 1726882834.48533: variable 'ansible_search_path' from source: unknown 30564 1726882834.48562: calling self._execute() 30564 1726882834.48631: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.48635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.48644: variable 'omit' from source: magic vars 30564 1726882834.48908: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.48919: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.49001: variable 'network_state' from source: role '' defaults 30564 1726882834.49010: Evaluated conditional (network_state != {}): False 30564 1726882834.49014: when evaluation is False, skipping this task 30564 1726882834.49016: _execute() done 30564 1726882834.49019: dumping result to json 30564 1726882834.49021: done dumping result, returning 30564 1726882834.49029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-000000000b36] 30564 1726882834.49033: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b36 30564 1726882834.49119: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b36 30564 1726882834.49121: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882834.49183: no more pending results, returning what we have 30564 1726882834.49187: results queue empty 30564 1726882834.49188: checking for any_errors_fatal 30564 1726882834.49192: done checking for any_errors_fatal 30564 1726882834.49193: checking for max_fail_percentage 30564 1726882834.49195: done checking for max_fail_percentage 30564 1726882834.49195: checking to see if all hosts have failed and the running result is not ok 30564 1726882834.49196: done checking to see if all hosts have failed 30564 1726882834.49197: getting the remaining hosts for this loop 30564 1726882834.49198: done getting the remaining hosts for this loop 30564 1726882834.49201: getting the next task for host managed_node2 30564 1726882834.49207: done getting next task for host managed_node2 30564 1726882834.49210: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882834.49215: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882834.49234: getting variables 30564 1726882834.49236: in VariableManager get_vars() 30564 1726882834.49258: Calling all_inventory to load vars for managed_node2 30564 1726882834.49259: Calling groups_inventory to load vars for managed_node2 30564 1726882834.49261: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.49271: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.49273: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.49275: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.50046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.51070: done with get_vars() 30564 1726882834.51086: done getting variables 30564 1726882834.51124: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:40:34 -0400 (0:00:00.030) 0:00:33.092 ****** 30564 1726882834.51145: entering _queue_task() for managed_node2/fail 30564 1726882834.51320: worker is 1 (out of 1 available) 30564 1726882834.51334: exiting _queue_task() for managed_node2/fail 30564 1726882834.51346: done queuing things up, now waiting for results queue to drain 30564 1726882834.51348: waiting for pending results... 30564 1726882834.51523: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882834.51616: in run() - task 0e448fcc-3ce9-4216-acec-000000000b37 30564 1726882834.51627: variable 'ansible_search_path' from source: unknown 30564 1726882834.51630: variable 'ansible_search_path' from source: unknown 30564 1726882834.51655: calling self._execute() 30564 1726882834.51726: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.51730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.51738: variable 'omit' from source: magic vars 30564 1726882834.51989: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.51999: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.52118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882834.53665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882834.53722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882834.53748: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882834.53781: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882834.53801: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882834.53856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.53885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.53902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.53927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.53938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.54007: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.54018: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882834.54021: when evaluation is False, skipping this task 30564 1726882834.54024: _execute() done 30564 1726882834.54026: dumping result to json 30564 1726882834.54029: done dumping result, returning 30564 1726882834.54036: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming 
configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-000000000b37] 30564 1726882834.54041: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b37 30564 1726882834.54120: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b37 30564 1726882834.54123: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882834.54169: no more pending results, returning what we have 30564 1726882834.54173: results queue empty 30564 1726882834.54174: checking for any_errors_fatal 30564 1726882834.54180: done checking for any_errors_fatal 30564 1726882834.54180: checking for max_fail_percentage 30564 1726882834.54182: done checking for max_fail_percentage 30564 1726882834.54183: checking to see if all hosts have failed and the running result is not ok 30564 1726882834.54183: done checking to see if all hosts have failed 30564 1726882834.54184: getting the remaining hosts for this loop 30564 1726882834.54186: done getting the remaining hosts for this loop 30564 1726882834.54189: getting the next task for host managed_node2 30564 1726882834.54196: done getting next task for host managed_node2 30564 1726882834.54199: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882834.54204: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882834.54220: getting variables 30564 1726882834.54221: in VariableManager get_vars() 30564 1726882834.54253: Calling all_inventory to load vars for managed_node2 30564 1726882834.54256: Calling groups_inventory to load vars for managed_node2 30564 1726882834.54258: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.54267: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.54270: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.54273: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.55028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.55987: done with get_vars() 30564 1726882834.56001: done getting variables 30564 1726882834.56038: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:40:34 -0400 (0:00:00.049) 0:00:33.141 ****** 30564 1726882834.56060: entering _queue_task() for managed_node2/dnf 30564 1726882834.56273: worker is 1 (out of 1 available) 30564 1726882834.56286: exiting _queue_task() for managed_node2/dnf 30564 1726882834.56300: done queuing things up, now waiting for results queue to drain 30564 1726882834.56301: waiting for pending results... 30564 1726882834.56590: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882834.56722: in run() - task 0e448fcc-3ce9-4216-acec-000000000b38 30564 1726882834.56744: variable 'ansible_search_path' from source: unknown 30564 1726882834.56754: variable 'ansible_search_path' from source: unknown 30564 1726882834.56796: calling self._execute() 30564 1726882834.56899: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.56911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.56927: variable 'omit' from source: magic vars 30564 1726882834.57305: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.57323: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.57522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882834.59151: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882834.59225: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882834.59261: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882834.59302: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882834.59322: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882834.59381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.59400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.59417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.59443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.59454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.59536: variable 'ansible_distribution' from source: facts 30564 1726882834.59540: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.59550: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882834.59627: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882834.59711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.59727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.59744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.59772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.59781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.59810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.59826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.59842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.59867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.59879: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.59913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.59937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.59959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.59997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.60011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.60153: variable 'network_connections' from source: include params 30564 1726882834.60165: variable 'interface' from source: play vars 30564 1726882834.60225: variable 'interface' from source: play vars 30564 1726882834.60287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882834.60425: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882834.60457: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882834.60490: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882834.60514: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882834.60831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882834.60853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882834.60880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.60903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882834.60956: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882834.61105: variable 'network_connections' from source: include params 30564 1726882834.61109: variable 'interface' from source: play vars 30564 1726882834.61151: variable 'interface' from source: play vars 30564 1726882834.61176: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882834.61179: when evaluation is False, skipping this task 30564 1726882834.61181: _execute() done 30564 1726882834.61184: dumping result to json 30564 1726882834.61186: done dumping result, returning 30564 1726882834.61195: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000b38] 30564 
1726882834.61198: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b38
30564 1726882834.61284: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b38
30564 1726882834.61287: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882834.61345: no more pending results, returning what we have
30564 1726882834.61349: results queue empty
30564 1726882834.61350: checking for any_errors_fatal
30564 1726882834.61355: done checking for any_errors_fatal
30564 1726882834.61356: checking for max_fail_percentage
30564 1726882834.61358: done checking for max_fail_percentage
30564 1726882834.61358: checking to see if all hosts have failed and the running result is not ok
30564 1726882834.61359: done checking to see if all hosts have failed
30564 1726882834.61360: getting the remaining hosts for this loop
30564 1726882834.61362: done getting the remaining hosts for this loop
30564 1726882834.61367: getting the next task for host managed_node2
30564 1726882834.61376: done getting next task for host managed_node2
30564 1726882834.61380: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882834.61385: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882834.61403: getting variables 30564 1726882834.61404: in VariableManager get_vars() 30564 1726882834.61433: Calling all_inventory to load vars for managed_node2 30564 1726882834.61435: Calling groups_inventory to load vars for managed_node2 30564 1726882834.61438: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.61446: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.61448: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.61451: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.67802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.70845: done with get_vars() 30564 1726882834.70877: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882834.70941: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:40:34 -0400 (0:00:00.149) 0:00:33.291 ****** 30564 1726882834.70976: entering _queue_task() for managed_node2/yum 30564 1726882834.71312: worker is 1 (out of 1 available) 30564 1726882834.71325: exiting _queue_task() for managed_node2/yum 30564 1726882834.71336: done queuing things up, now waiting for results queue to drain 30564 1726882834.71338: waiting for pending results... 30564 1726882834.71643: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882834.71806: in run() - task 0e448fcc-3ce9-4216-acec-000000000b39 30564 1726882834.71827: variable 'ansible_search_path' from source: unknown 30564 1726882834.71837: variable 'ansible_search_path' from source: unknown 30564 1726882834.71884: calling self._execute() 30564 1726882834.71992: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.72012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.72028: variable 'omit' from source: magic vars 30564 1726882834.72420: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.72445: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.72637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882834.75119: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882834.75205: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882834.75243: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882834.75293: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882834.75324: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882834.75410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.75445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.75486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.75534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.75554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.75661: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.75692: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882834.75701: when evaluation is False, skipping this task 30564 1726882834.75709: _execute() done 30564 1726882834.75717: dumping result to json 30564 1726882834.75724: done dumping result, returning 30564 1726882834.75734: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000b39]
30564 1726882834.75744: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b39
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30564 1726882834.75911: no more pending results, returning what we have
30564 1726882834.75915: results queue empty
30564 1726882834.75917: checking for any_errors_fatal
30564 1726882834.75926: done checking for any_errors_fatal
30564 1726882834.75927: checking for max_fail_percentage
30564 1726882834.75928: done checking for max_fail_percentage
30564 1726882834.75929: checking to see if all hosts have failed and the running result is not ok
30564 1726882834.75930: done checking to see if all hosts have failed
30564 1726882834.75931: getting the remaining hosts for this loop
30564 1726882834.75933: done getting the remaining hosts for this loop
30564 1726882834.75937: getting the next task for host managed_node2
30564 1726882834.75945: done getting next task for host managed_node2
30564 1726882834.75950: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882834.75956: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882834.75981: getting variables 30564 1726882834.75984: in VariableManager get_vars() 30564 1726882834.76018: Calling all_inventory to load vars for managed_node2 30564 1726882834.76021: Calling groups_inventory to load vars for managed_node2 30564 1726882834.76023: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.76037: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.76040: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.76043: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.77305: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b39 30564 1726882834.77309: WORKER PROCESS EXITING 30564 1726882834.77855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.79627: done with get_vars() 30564 1726882834.79649: done getting variables 30564 1726882834.79708: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** 
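The skipped-task results in this trace all follow one pattern: the task's `when` expressions are evaluated in order, and the first expression found false is reported as `false_condition` with `skip_reason: "Conditional result was False"`. A minimal sketch of that bookkeeping, with illustrative names only (this is not Ansible's internal API):

```python
# Hedged sketch of how a "skipping" task result is assembled.
# Assumption: `when` clauses are checked in order and the first
# false one is reported; names here are illustrative.
def evaluate_when(conditions):
    """conditions: list of (expression_text, bool_result) pairs."""
    for expr, result in conditions:
        if not result:
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return None  # no false condition: the task would run

# Mirrors the DNF-check task above: the distribution check passed,
# but neither wireless nor team connections were defined.
result = evaluate_when([
    ("ansible_distribution_major_version != '6'", True),
    ("__network_wireless_connections_defined or "
     "__network_team_connections_defined", False),
])
```

Under these assumptions, `result` matches the shape of the `skipping: [managed_node2]` payloads printed in the log.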
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:40:34 -0400 (0:00:00.087) 0:00:33.378 ****** 30564 1726882834.79742: entering _queue_task() for managed_node2/fail 30564 1726882834.80018: worker is 1 (out of 1 available) 30564 1726882834.80031: exiting _queue_task() for managed_node2/fail 30564 1726882834.80043: done queuing things up, now waiting for results queue to drain 30564 1726882834.80045: waiting for pending results... 30564 1726882834.80327: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882834.80480: in run() - task 0e448fcc-3ce9-4216-acec-000000000b3a 30564 1726882834.80503: variable 'ansible_search_path' from source: unknown 30564 1726882834.80512: variable 'ansible_search_path' from source: unknown 30564 1726882834.80552: calling self._execute() 30564 1726882834.80659: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.80675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.80689: variable 'omit' from source: magic vars 30564 1726882834.81051: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.81071: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.81198: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882834.81399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882834.84095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882834.84161: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882834.84220: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882834.84258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882834.84297: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882834.84381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.84420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.84451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.84508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.84528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.84580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.84610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.84641: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.84697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.84716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.84762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.84797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.84827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.84878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.84897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.85093: variable 'network_connections' from source: include params 30564 1726882834.85119: variable 'interface' from source: play vars 30564 1726882834.85199: variable 'interface' from source: play vars 30564 1726882834.85279: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882834.85456: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882834.85506: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882834.85541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882834.85589: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882834.85660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882834.85693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882834.85727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.85758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882834.85838: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882834.86209: variable 'network_connections' from source: include params 30564 1726882834.86229: variable 'interface' from source: play vars 30564 1726882834.86324: variable 'interface' from source: play vars 30564 1726882834.86382: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882834.86396: when evaluation is False, skipping this task 30564 
1726882834.86403: _execute() done
30564 1726882834.86409: dumping result to json
30564 1726882834.86418: done dumping result, returning
30564 1726882834.86429: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000b3a]
30564 1726882834.86445: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3a
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882834.86653: no more pending results, returning what we have
30564 1726882834.86662: results queue empty
30564 1726882834.86665: checking for any_errors_fatal
30564 1726882834.86675: done checking for any_errors_fatal
30564 1726882834.86676: checking for max_fail_percentage
30564 1726882834.86678: done checking for max_fail_percentage
30564 1726882834.86679: checking to see if all hosts have failed and the running result is not ok
30564 1726882834.86683: done checking to see if all hosts have failed
30564 1726882834.86684: getting the remaining hosts for this loop
30564 1726882834.86686: done getting the remaining hosts for this loop
30564 1726882834.86692: getting the next task for host managed_node2
30564 1726882834.86702: done getting next task for host managed_node2
30564 1726882834.86706: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
30564 1726882834.86711: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882834.86730: getting variables 30564 1726882834.86732: in VariableManager get_vars() 30564 1726882834.86774: Calling all_inventory to load vars for managed_node2 30564 1726882834.86777: Calling groups_inventory to load vars for managed_node2 30564 1726882834.86780: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882834.86797: Calling all_plugins_play to load vars for managed_node2 30564 1726882834.86803: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882834.86811: Calling groups_plugins_play to load vars for managed_node2 30564 1726882834.87904: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3a 30564 1726882834.87908: WORKER PROCESS EXITING 30564 1726882834.88997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882834.91000: done with get_vars() 30564 1726882834.91027: done getting variables 30564 1726882834.91092: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:40:34 -0400 (0:00:00.113) 0:00:33.492 ****** 30564 1726882834.91132: entering _queue_task() for managed_node2/package 30564 1726882834.91472: worker is 1 (out of 1 available) 30564 1726882834.91486: exiting _queue_task() for managed_node2/package 30564 1726882834.91499: done queuing things up, now waiting for results queue to drain 30564 1726882834.91501: waiting for pending results... 30564 1726882834.91890: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882834.92107: in run() - task 0e448fcc-3ce9-4216-acec-000000000b3b 30564 1726882834.92131: variable 'ansible_search_path' from source: unknown 30564 1726882834.92179: variable 'ansible_search_path' from source: unknown 30564 1726882834.92231: calling self._execute() 30564 1726882834.92384: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882834.92408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882834.92425: variable 'omit' from source: magic vars 30564 1726882834.92916: variable 'ansible_distribution_major_version' from source: facts 30564 1726882834.92935: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882834.93172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882834.93464: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882834.93536: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882834.93581: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882834.93644: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882834.93786: variable 'network_packages' from source: role '' defaults 30564 1726882834.93907: variable '__network_provider_setup' from source: role '' defaults 30564 1726882834.93928: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882834.94007: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882834.94020: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882834.94089: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882834.94304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882834.96838: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882834.96922: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882834.96970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882834.97017: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882834.97054: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882834.97161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.97279: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.97358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.97431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.97453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.97530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.97587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.97626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.97680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.97699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882834.97983: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882834.98095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.98124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.98154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.98206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.98226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.98326: variable 'ansible_python' from source: facts 30564 1726882834.98347: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882834.98439: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882834.98540: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882834.98695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.98730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.98760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.98809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.98834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.98899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882834.98948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882834.98985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.99028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882834.99053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882834.99220: variable 'network_connections' from source: include params 
30564 1726882834.99231: variable 'interface' from source: play vars 30564 1726882834.99353: variable 'interface' from source: play vars 30564 1726882834.99436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882834.99471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882834.99520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882834.99571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882834.99627: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882834.99958: variable 'network_connections' from source: include params 30564 1726882834.99974: variable 'interface' from source: play vars 30564 1726882835.00086: variable 'interface' from source: play vars 30564 1726882835.00150: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882835.00244: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882835.00610: variable 'network_connections' from source: include params 30564 1726882835.00623: variable 'interface' from source: play vars 30564 1726882835.00701: variable 'interface' from source: play vars 30564 1726882835.00728: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882835.00822: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882835.01182: variable 'network_connections' 
from source: include params 30564 1726882835.01193: variable 'interface' from source: play vars 30564 1726882835.01266: variable 'interface' from source: play vars 30564 1726882835.01346: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882835.01427: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882835.01444: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882835.01511: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882835.01846: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882835.03083: variable 'network_connections' from source: include params 30564 1726882835.03090: variable 'interface' from source: play vars 30564 1726882835.03223: variable 'interface' from source: play vars 30564 1726882835.03232: variable 'ansible_distribution' from source: facts 30564 1726882835.03235: variable '__network_rh_distros' from source: role '' defaults 30564 1726882835.03249: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.03291: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882835.03548: variable 'ansible_distribution' from source: facts 30564 1726882835.03552: variable '__network_rh_distros' from source: role '' defaults 30564 1726882835.03557: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.03573: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882835.03735: variable 'ansible_distribution' from source: facts 30564 1726882835.03738: variable '__network_rh_distros' from source: role '' defaults 30564 1726882835.03743: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.03782: variable 'network_provider' from source: set_fact 30564 
1726882835.03801: variable 'ansible_facts' from source: unknown 30564 1726882835.04717: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882835.04721: when evaluation is False, skipping this task 30564 1726882835.04724: _execute() done 30564 1726882835.04728: dumping result to json 30564 1726882835.04730: done dumping result, returning 30564 1726882835.04736: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000000b3b] 30564 1726882835.04741: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3b 30564 1726882835.04858: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3b 30564 1726882835.04861: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882835.04914: no more pending results, returning what we have 30564 1726882835.04918: results queue empty 30564 1726882835.04919: checking for any_errors_fatal 30564 1726882835.04926: done checking for any_errors_fatal 30564 1726882835.04926: checking for max_fail_percentage 30564 1726882835.04928: done checking for max_fail_percentage 30564 1726882835.04929: checking to see if all hosts have failed and the running result is not ok 30564 1726882835.04929: done checking to see if all hosts have failed 30564 1726882835.04930: getting the remaining hosts for this loop 30564 1726882835.04932: done getting the remaining hosts for this loop 30564 1726882835.04936: getting the next task for host managed_node2 30564 1726882835.04944: done getting next task for host managed_node2 30564 1726882835.04948: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882835.04953: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882835.04975: getting variables 30564 1726882835.04977: in VariableManager get_vars() 30564 1726882835.05015: Calling all_inventory to load vars for managed_node2 30564 1726882835.05018: Calling groups_inventory to load vars for managed_node2 30564 1726882835.05020: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882835.05029: Calling all_plugins_play to load vars for managed_node2 30564 1726882835.05032: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882835.05034: Calling groups_plugins_play to load vars for managed_node2 30564 1726882835.06682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882835.08890: done with get_vars() 30564 1726882835.08978: done getting variables 30564 1726882835.09036: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:40:35 -0400 (0:00:00.179) 0:00:33.672 ****** 30564 1726882835.09074: entering _queue_task() for managed_node2/package 30564 1726882835.09373: worker is 1 (out of 1 available) 30564 1726882835.09386: exiting _queue_task() for managed_node2/package 30564 1726882835.09400: done queuing things up, now waiting for results queue to drain 30564 1726882835.09402: waiting for pending results... 
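[editorial aside, not part of the captured log] The "Install packages" task above was skipped because its `when:` condition, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False — every requested package was already present in the gathered package facts. Ansible's builtin `subset` test is plain set containment, which the following sketch illustrates (the package names are hypothetical, not taken from this run):

```python
# Illustration of the Jinja2/Ansible `subset` test behind the skip above:
# `x is subset(y)` is set containment, i.e. set(x) <= set(y).
# Hypothetical package names; the real values are not shown in the log.
network_packages = ["NetworkManager"]
packages_facts = {
    "NetworkManager": [{"version": "1.0"}],  # shape of ansible_facts.packages
    "openssh": [{"version": "9.0"}],
}

already_installed = set(network_packages) <= set(packages_facts.keys())
# The task's condition is the negation: run only if something is missing.
run_task = not already_installed
print(run_task)  # False -> the task is skipped, matching the log
```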
30564 1726882835.09788: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882835.09945: in run() - task 0e448fcc-3ce9-4216-acec-000000000b3c 30564 1726882835.09975: variable 'ansible_search_path' from source: unknown 30564 1726882835.09984: variable 'ansible_search_path' from source: unknown 30564 1726882835.10028: calling self._execute() 30564 1726882835.10241: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882835.10255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882835.10395: variable 'omit' from source: magic vars 30564 1726882835.10801: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.10820: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882835.11088: variable 'network_state' from source: role '' defaults 30564 1726882835.11186: Evaluated conditional (network_state != {}): False 30564 1726882835.11194: when evaluation is False, skipping this task 30564 1726882835.11201: _execute() done 30564 1726882835.11208: dumping result to json 30564 1726882835.11215: done dumping result, returning 30564 1726882835.11227: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000b3c] 30564 1726882835.11239: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3c 30564 1726882835.11476: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3c 30564 1726882835.11486: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882835.11540: no more pending results, returning what we have 30564 1726882835.11545: results queue empty 30564 1726882835.11546: checking 
for any_errors_fatal 30564 1726882835.11553: done checking for any_errors_fatal 30564 1726882835.11554: checking for max_fail_percentage 30564 1726882835.11557: done checking for max_fail_percentage 30564 1726882835.11558: checking to see if all hosts have failed and the running result is not ok 30564 1726882835.11558: done checking to see if all hosts have failed 30564 1726882835.11559: getting the remaining hosts for this loop 30564 1726882835.11561: done getting the remaining hosts for this loop 30564 1726882835.11570: getting the next task for host managed_node2 30564 1726882835.11579: done getting next task for host managed_node2 30564 1726882835.11584: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882835.11591: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882835.11614: getting variables 30564 1726882835.11616: in VariableManager get_vars() 30564 1726882835.11652: Calling all_inventory to load vars for managed_node2 30564 1726882835.11655: Calling groups_inventory to load vars for managed_node2 30564 1726882835.11658: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882835.11676: Calling all_plugins_play to load vars for managed_node2 30564 1726882835.11680: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882835.11684: Calling groups_plugins_play to load vars for managed_node2 30564 1726882835.14077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882835.17984: done with get_vars() 30564 1726882835.18013: done getting variables 30564 1726882835.18075: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:40:35 -0400 (0:00:00.090) 0:00:33.762 ****** 30564 1726882835.18112: entering _queue_task() for managed_node2/package 30564 1726882835.18436: worker is 1 (out of 1 available) 30564 1726882835.18450: exiting _queue_task() for managed_node2/package 30564 1726882835.19070: done queuing things up, now waiting for results queue to drain 30564 1726882835.19072: waiting for pending results... 
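[editorial aside, not part of the captured log] The "Install NetworkManager and nmstate" task above was skipped on the guard `network_state != {}`, and the log records `network_state` coming from "role '' defaults". A minimal sketch of why that condition is False (assuming the default is an empty dict, as the skip implies — this is not role code):

```python
# Sketch of the guard that skipped the nmstate install task.
# Assumption: network_state keeps its empty-dict role default when the
# play drives the role via network_connections instead.
network_state = {}

run_task = network_state != {}
print(run_task)  # False -> "skip_reason": "Conditional result was False"
```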
30564 1726882835.19393: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882835.19777: in run() - task 0e448fcc-3ce9-4216-acec-000000000b3d 30564 1726882835.19798: variable 'ansible_search_path' from source: unknown 30564 1726882835.19808: variable 'ansible_search_path' from source: unknown 30564 1726882835.19847: calling self._execute() 30564 1726882835.19957: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882835.20080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882835.20097: variable 'omit' from source: magic vars 30564 1726882835.20905: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.20953: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882835.21193: variable 'network_state' from source: role '' defaults 30564 1726882835.21277: Evaluated conditional (network_state != {}): False 30564 1726882835.21286: when evaluation is False, skipping this task 30564 1726882835.21293: _execute() done 30564 1726882835.21300: dumping result to json 30564 1726882835.21308: done dumping result, returning 30564 1726882835.21320: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000b3d] 30564 1726882835.21331: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3d 30564 1726882835.21570: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3d 30564 1726882835.21579: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882835.21632: no more pending results, returning what we have 30564 1726882835.21635: results queue empty 30564 1726882835.21637: checking for 
any_errors_fatal 30564 1726882835.21644: done checking for any_errors_fatal 30564 1726882835.21645: checking for max_fail_percentage 30564 1726882835.21647: done checking for max_fail_percentage 30564 1726882835.21647: checking to see if all hosts have failed and the running result is not ok 30564 1726882835.21648: done checking to see if all hosts have failed 30564 1726882835.21649: getting the remaining hosts for this loop 30564 1726882835.21650: done getting the remaining hosts for this loop 30564 1726882835.21654: getting the next task for host managed_node2 30564 1726882835.21662: done getting next task for host managed_node2 30564 1726882835.21670: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882835.21677: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882835.21697: getting variables 30564 1726882835.21699: in VariableManager get_vars() 30564 1726882835.21736: Calling all_inventory to load vars for managed_node2 30564 1726882835.21739: Calling groups_inventory to load vars for managed_node2 30564 1726882835.21741: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882835.21755: Calling all_plugins_play to load vars for managed_node2 30564 1726882835.21758: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882835.21761: Calling groups_plugins_play to load vars for managed_node2 30564 1726882835.24716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882835.27895: done with get_vars() 30564 1726882835.27922: done getting variables 30564 1726882835.27985: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:40:35 -0400 (0:00:00.099) 0:00:33.861 ****** 30564 1726882835.28020: entering _queue_task() for managed_node2/service 30564 1726882835.29052: worker is 1 (out of 1 available) 30564 1726882835.29070: exiting _queue_task() for managed_node2/service 30564 1726882835.29084: done queuing things up, now waiting for results queue to drain 30564 1726882835.29085: waiting for pending results... 
30564 1726882835.30004: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882835.30383: in run() - task 0e448fcc-3ce9-4216-acec-000000000b3e 30564 1726882835.30405: variable 'ansible_search_path' from source: unknown 30564 1726882835.30414: variable 'ansible_search_path' from source: unknown 30564 1726882835.30455: calling self._execute() 30564 1726882835.30565: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882835.30708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882835.30726: variable 'omit' from source: magic vars 30564 1726882835.31443: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.31586: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882835.31823: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882835.32257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882835.36830: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882835.36919: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882835.36962: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882835.37006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882835.37042: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882835.37127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882835.37167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.37198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.37250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.37275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.37323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882835.37357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.37392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.37435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.37458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.37507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882835.37534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.37569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.37617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.37636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.37834: variable 'network_connections' from source: include params 30564 1726882835.37849: variable 'interface' from source: play vars 30564 1726882835.37932: variable 'interface' from source: play vars 30564 1726882835.38016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882835.38188: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882835.38249: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882835.38288: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882835.38324: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882835.38375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882835.38401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882835.38435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.38471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882835.38541: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882835.38811: variable 'network_connections' from source: include params 30564 1726882835.38822: variable 'interface' from source: play vars 30564 1726882835.38895: variable 'interface' from source: play vars 30564 1726882835.38932: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882835.38940: when evaluation is False, skipping this task 30564 1726882835.38947: _execute() done 30564 1726882835.38954: dumping result to json 30564 1726882835.38963: done dumping result, returning 30564 1726882835.38980: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000b3e] 30564 1726882835.38995: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3e skipping: [managed_node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882835.39158: no more pending results, returning what we have 30564 1726882835.39162: results queue empty 30564 1726882835.39166: checking for any_errors_fatal 30564 1726882835.39173: done checking for any_errors_fatal 30564 1726882835.39175: checking for max_fail_percentage 30564 1726882835.39177: done checking for max_fail_percentage 30564 1726882835.39178: checking to see if all hosts have failed and the running result is not ok 30564 1726882835.39178: done checking to see if all hosts have failed 30564 1726882835.39179: getting the remaining hosts for this loop 30564 1726882835.39181: done getting the remaining hosts for this loop 30564 1726882835.39186: getting the next task for host managed_node2 30564 1726882835.39194: done getting next task for host managed_node2 30564 1726882835.39199: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882835.39204: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882835.39225: getting variables 30564 1726882835.39227: in VariableManager get_vars() 30564 1726882835.39266: Calling all_inventory to load vars for managed_node2 30564 1726882835.39269: Calling groups_inventory to load vars for managed_node2 30564 1726882835.39272: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882835.39283: Calling all_plugins_play to load vars for managed_node2 30564 1726882835.39285: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882835.39288: Calling groups_plugins_play to load vars for managed_node2 30564 1726882835.40344: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3e 30564 1726882835.40348: WORKER PROCESS EXITING 30564 1726882835.41116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882835.43060: done with get_vars() 30564 1726882835.43085: done getting variables 30564 1726882835.43140: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:40:35 -0400 (0:00:00.151) 0:00:34.013 ****** 30564 1726882835.43183: entering _queue_task() for managed_node2/service 30564 1726882835.43507: worker is 1 (out of 1 available) 30564 1726882835.43519: exiting _queue_task() for managed_node2/service 30564 1726882835.43532: done queuing 
things up, now waiting for results queue to drain 30564 1726882835.43534: waiting for pending results... 30564 1726882835.43831: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882835.43988: in run() - task 0e448fcc-3ce9-4216-acec-000000000b3f 30564 1726882835.44009: variable 'ansible_search_path' from source: unknown 30564 1726882835.44018: variable 'ansible_search_path' from source: unknown 30564 1726882835.44066: calling self._execute() 30564 1726882835.44178: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882835.44190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882835.44209: variable 'omit' from source: magic vars 30564 1726882835.44608: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.44628: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882835.44994: variable 'network_provider' from source: set_fact 30564 1726882835.45004: variable 'network_state' from source: role '' defaults 30564 1726882835.45024: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882835.45035: variable 'omit' from source: magic vars 30564 1726882835.45104: variable 'omit' from source: magic vars 30564 1726882835.45141: variable 'network_service_name' from source: role '' defaults 30564 1726882835.45210: variable 'network_service_name' from source: role '' defaults 30564 1726882835.45477: variable '__network_provider_setup' from source: role '' defaults 30564 1726882835.45570: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882835.45636: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882835.45681: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882835.45744: variable '__network_packages_default_nm' from source: role '' defaults 
30564 1726882835.46333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882835.50203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882835.50294: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882835.50343: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882835.50384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882835.50414: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882835.50612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882835.50646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.50690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.50815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.50835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.50928: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882835.51001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.51127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.51173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.51195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.51714: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882835.51982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882835.52010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.52111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.52156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.52314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.52420: variable 'ansible_python' from source: facts 30564 1726882835.52537: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882835.52734: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882835.52819: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882835.53083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882835.53198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.53228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.53274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.53413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.53463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882835.53519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882835.53569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.53620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882835.53639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882835.53790: variable 'network_connections' from source: include params 30564 1726882835.53802: variable 'interface' from source: play vars 30564 1726882835.53887: variable 'interface' from source: play vars 30564 1726882835.54006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882835.54212: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882835.54274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882835.54321: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882835.54372: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882835.54437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882835.54633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882835.54673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882835.54829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882835.54882: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882835.55446: variable 'network_connections' from source: include params 30564 1726882835.55478: variable 'interface' from source: play vars 30564 1726882835.55605: variable 'interface' from source: play vars 30564 1726882835.55711: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882835.55912: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882835.56277: variable 'network_connections' from source: include params 30564 1726882835.56288: variable 'interface' from source: play vars 30564 1726882835.56356: variable 'interface' from source: play vars 30564 1726882835.56387: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882835.56467: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882835.56760: variable 'network_connections' from source: include params 30564 1726882835.56773: variable 'interface' from source: play vars 30564 1726882835.56848: variable 'interface' from source: play vars 30564 1726882835.56913: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882835.56976: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882835.56987: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882835.57047: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882835.57262: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882835.57973: variable 'network_connections' from source: include params 30564 1726882835.57984: variable 'interface' from source: play vars 30564 1726882835.58045: variable 'interface' from source: play vars 30564 1726882835.58060: variable 'ansible_distribution' from source: facts 30564 1726882835.58072: variable '__network_rh_distros' from source: role '' defaults 30564 1726882835.58083: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.58113: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882835.58286: variable 'ansible_distribution' from source: facts 30564 1726882835.58295: variable '__network_rh_distros' from source: role '' defaults 30564 1726882835.58305: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.58318: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882835.58484: variable 'ansible_distribution' from source: facts 30564 1726882835.59015: variable '__network_rh_distros' from source: role '' defaults 30564 1726882835.59026: variable 'ansible_distribution_major_version' from source: facts 30564 1726882835.59067: variable 'network_provider' from source: set_fact 30564 1726882835.59097: variable 'omit' from source: magic vars 30564 1726882835.59127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882835.59158: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882835.59186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882835.59210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882835.59225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882835.59258: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882835.59269: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882835.59277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882835.59376: Set connection var ansible_timeout to 10 30564 1726882835.59388: Set connection var ansible_pipelining to False 30564 1726882835.59395: Set connection var ansible_shell_type to sh 30564 1726882835.59406: Set connection var ansible_shell_executable to /bin/sh 30564 1726882835.59420: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882835.59427: Set connection var ansible_connection to ssh 30564 1726882835.59458: variable 'ansible_shell_executable' from source: unknown 30564 1726882835.59468: variable 'ansible_connection' from source: unknown 30564 1726882835.59476: variable 'ansible_module_compression' from source: unknown 30564 1726882835.59483: variable 'ansible_shell_type' from source: unknown 30564 1726882835.59489: variable 'ansible_shell_executable' from source: unknown 30564 1726882835.59496: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882835.59504: variable 'ansible_pipelining' from source: unknown 30564 1726882835.59510: variable 'ansible_timeout' from source: unknown 30564 1726882835.59517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882835.59623: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882835.59809: variable 'omit' from source: magic vars 30564 1726882835.59820: starting attempt loop 30564 1726882835.59826: running the handler 30564 1726882835.59906: variable 'ansible_facts' from source: unknown 30564 1726882835.60781: _low_level_execute_command(): starting 30564 1726882835.60807: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882835.61453: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882835.61472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.61488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.61508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.61548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.61561: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882835.61582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.61601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882835.61613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882835.61626: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882835.61639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.61652: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882835.61672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.61685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.61698: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882835.61713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.61787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882835.61804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882835.61819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882835.62405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882835.63747: stdout chunk (state=3): >>>/root <<< 30564 1726882835.63930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882835.63933: stdout chunk (state=3): >>><<< 30564 1726882835.63936: stderr chunk (state=3): >>><<< 30564 1726882835.63970: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882835.63973: _low_level_execute_command(): starting 30564 1726882835.64047: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493 `" && echo ansible-tmp-1726882835.6395297-32134-54630347138493="` echo /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493 `" ) && sleep 0' 30564 1726882835.64994: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882835.65195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.65210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.65229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.65289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.65302: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882835.65316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.65334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882835.65347: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882835.65359: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882835.65410: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882835.65470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.65489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.65501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.65512: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882835.65588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.65660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882835.65680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882835.65695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882835.65829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882835.68320: stdout chunk (state=3): >>>ansible-tmp-1726882835.6395297-32134-54630347138493=/root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493 <<< 30564 1726882835.68470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882835.68542: stderr chunk (state=3): >>><<< 30564 1726882835.68555: stdout chunk (state=3): >>><<< 30564 1726882835.68629: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882835.6395297-32134-54630347138493=/root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882835.68637: variable 'ansible_module_compression' from source: unknown 30564 1726882835.68806: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882835.69116: variable 'ansible_facts' from source: unknown 30564 1726882835.69119: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493/AnsiballZ_systemd.py 30564 1726882835.69810: Sending initial data 30564 1726882835.69813: Sent initial data (155 bytes) 30564 1726882835.70735: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882835.70762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.70780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.70798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.70856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.70873: stderr chunk (state=3): >>>debug2: match not found <<< 30564 
1726882835.70888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.70904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882835.70915: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882835.70925: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882835.70936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.70948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.70963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.70980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.70991: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882835.71004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.71090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882835.71111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882835.71126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882835.71253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882835.73060: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882835.73156: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882835.73256: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpcqyhfk1g /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493/AnsiballZ_systemd.py <<< 30564 1726882835.73352: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882835.77180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882835.77300: stderr chunk (state=3): >>><<< 30564 1726882835.77303: stdout chunk (state=3): >>><<< 30564 1726882835.77305: done transferring module to remote 30564 1726882835.77307: _low_level_execute_command(): starting 30564 1726882835.77309: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493/ /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493/AnsiballZ_systemd.py && sleep 0' 30564 1726882835.78868: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882835.78881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.78891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.78905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.78943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.78956: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882835.78974: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.78987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882835.78995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882835.79002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882835.79009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.79018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.79029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.79037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.79043: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882835.79053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.79137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882835.79285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882835.79301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882835.79499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882835.81381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882835.81426: stderr chunk (state=3): >>><<< 30564 1726882835.81429: stdout chunk (state=3): >>><<< 30564 1726882835.81446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882835.81454: _low_level_execute_command(): starting 30564 1726882835.81457: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493/AnsiballZ_systemd.py && sleep 0' 30564 1726882835.83116: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882835.83124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.83135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.83149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.83199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.83280: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882835.83292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.83305: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882835.83313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882835.83320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882835.83327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882835.83336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882835.83347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882835.83354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882835.83361: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882835.83374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882835.83451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882835.83510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882835.83606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882835.83833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882836.09357: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", 
"BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882836.09380: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9134080", "MemoryAvailable": "infinity", "CPUUsageNSec": "2114565000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": 
"no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882836.09391: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": 
"shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": 
"none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882836.11008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882836.11012: stdout chunk (state=3): >>><<< 30564 1726882836.11018: stderr chunk (state=3): >>><<< 30564 1726882836.11041: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9134080", "MemoryAvailable": "infinity", "CPUUsageNSec": "2114565000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 
21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882836.11231: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882836.11252: _low_level_execute_command(): starting 30564 1726882836.11255: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882835.6395297-32134-54630347138493/ > /dev/null 2>&1 && sleep 0' 30564 1726882836.12756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882836.12822: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882836.12838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.12856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.12900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.13034: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882836.13051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.13070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882836.13083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882836.13092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882836.13103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.13115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.13131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.13143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.13157: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882836.13173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.13273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882836.13294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882836.13309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882836.13442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30564 1726882836.15392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882836.15395: stdout chunk (state=3): >>><<< 30564 1726882836.15398: stderr chunk (state=3): >>><<< 30564 1726882836.15471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882836.15475: handler run complete 30564 1726882836.15673: attempt loop complete, returning result 30564 1726882836.15676: _execute() done 30564 1726882836.15678: dumping result to json 30564 1726882836.15680: done dumping result, returning 30564 1726882836.15682: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000000b3f] 30564 1726882836.15685: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3f ok: [managed_node2] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882836.15889: no more pending results, returning what we have 30564 1726882836.15892: results queue empty 30564 1726882836.15893: checking for any_errors_fatal 30564 1726882836.15900: done checking for any_errors_fatal 30564 1726882836.15901: checking for max_fail_percentage 30564 1726882836.15903: done checking for max_fail_percentage 30564 1726882836.15904: checking to see if all hosts have failed and the running result is not ok 30564 1726882836.15904: done checking to see if all hosts have failed 30564 1726882836.15905: getting the remaining hosts for this loop 30564 1726882836.15907: done getting the remaining hosts for this loop 30564 1726882836.15911: getting the next task for host managed_node2 30564 1726882836.15918: done getting next task for host managed_node2 30564 1726882836.15923: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882836.15928: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882836.15941: getting variables 30564 1726882836.15943: in VariableManager get_vars() 30564 1726882836.15981: Calling all_inventory to load vars for managed_node2 30564 1726882836.15984: Calling groups_inventory to load vars for managed_node2 30564 1726882836.15986: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882836.15996: Calling all_plugins_play to load vars for managed_node2 30564 1726882836.15999: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882836.16001: Calling groups_plugins_play to load vars for managed_node2 30564 1726882836.16698: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b3f 30564 1726882836.16705: WORKER PROCESS EXITING 30564 1726882836.18732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882836.21480: done with get_vars() 30564 1726882836.21529: done getting variables 30564 1726882836.21615: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:40:36 -0400 (0:00:00.784) 0:00:34.797 ****** 30564 1726882836.21672: entering _queue_task() for managed_node2/service 30564 1726882836.22135: worker is 1 (out of 1 available) 30564 1726882836.22150: exiting _queue_task() for managed_node2/service 30564 1726882836.22169: done queuing 
things up, now waiting for results queue to drain 30564 1726882836.22171: waiting for pending results... 30564 1726882836.22811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882836.22976: in run() - task 0e448fcc-3ce9-4216-acec-000000000b40 30564 1726882836.22990: variable 'ansible_search_path' from source: unknown 30564 1726882836.22994: variable 'ansible_search_path' from source: unknown 30564 1726882836.23038: calling self._execute() 30564 1726882836.23157: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882836.23166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882836.23181: variable 'omit' from source: magic vars 30564 1726882836.23618: variable 'ansible_distribution_major_version' from source: facts 30564 1726882836.23630: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882836.23752: variable 'network_provider' from source: set_fact 30564 1726882836.23759: Evaluated conditional (network_provider == "nm"): True 30564 1726882836.23867: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882836.23978: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882836.24180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882836.29822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882836.29826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882836.29829: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882836.29831: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 
1726882836.29833: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882836.29836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882836.29839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882836.29841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882836.29878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882836.29900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882836.29944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882836.29967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882836.30005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882836.30044: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882836.30057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882836.30111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882836.30135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882836.30157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882836.30238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882836.30253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882836.30438: variable 'network_connections' from source: include params 30564 1726882836.30450: variable 'interface' from source: play vars 30564 1726882836.30528: variable 'interface' from source: play vars 30564 1726882836.30626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882836.30813: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 
1726882836.30852: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882836.30892: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882836.30919: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882836.30963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882836.30995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882836.31020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882836.31045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882836.31108: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882836.31383: variable 'network_connections' from source: include params 30564 1726882836.31392: variable 'interface' from source: play vars 30564 1726882836.31460: variable 'interface' from source: play vars 30564 1726882836.31505: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882836.31512: when evaluation is False, skipping this task 30564 1726882836.31519: _execute() done 30564 1726882836.31523: dumping result to json 30564 1726882836.31525: done dumping result, returning 30564 1726882836.31535: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start 
wpa_supplicant [0e448fcc-3ce9-4216-acec-000000000b40] 30564 1726882836.31544: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b40 30564 1726882836.31639: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b40 30564 1726882836.31643: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882836.31697: no more pending results, returning what we have 30564 1726882836.31701: results queue empty 30564 1726882836.31702: checking for any_errors_fatal 30564 1726882836.31724: done checking for any_errors_fatal 30564 1726882836.31725: checking for max_fail_percentage 30564 1726882836.31727: done checking for max_fail_percentage 30564 1726882836.31728: checking to see if all hosts have failed and the running result is not ok 30564 1726882836.31729: done checking to see if all hosts have failed 30564 1726882836.31729: getting the remaining hosts for this loop 30564 1726882836.31731: done getting the remaining hosts for this loop 30564 1726882836.31735: getting the next task for host managed_node2 30564 1726882836.31744: done getting next task for host managed_node2 30564 1726882836.31748: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882836.31753: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882836.31776: getting variables 30564 1726882836.31778: in VariableManager get_vars() 30564 1726882836.31814: Calling all_inventory to load vars for managed_node2 30564 1726882836.31817: Calling groups_inventory to load vars for managed_node2 30564 1726882836.31819: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882836.31830: Calling all_plugins_play to load vars for managed_node2 30564 1726882836.31833: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882836.31836: Calling groups_plugins_play to load vars for managed_node2 30564 1726882836.34639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882836.38019: done with get_vars() 30564 1726882836.38041: done getting variables 30564 1726882836.38110: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:40:36 -0400 (0:00:00.164) 0:00:34.962 
****** 30564 1726882836.38141: entering _queue_task() for managed_node2/service 30564 1726882836.38942: worker is 1 (out of 1 available) 30564 1726882836.38955: exiting _queue_task() for managed_node2/service 30564 1726882836.38979: done queuing things up, now waiting for results queue to drain 30564 1726882836.38980: waiting for pending results... 30564 1726882836.39262: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882836.39428: in run() - task 0e448fcc-3ce9-4216-acec-000000000b41 30564 1726882836.39445: variable 'ansible_search_path' from source: unknown 30564 1726882836.39452: variable 'ansible_search_path' from source: unknown 30564 1726882836.39504: calling self._execute() 30564 1726882836.39609: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882836.39626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882836.39652: variable 'omit' from source: magic vars 30564 1726882836.40198: variable 'ansible_distribution_major_version' from source: facts 30564 1726882836.40215: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882836.40332: variable 'network_provider' from source: set_fact 30564 1726882836.40384: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882836.40576: when evaluation is False, skipping this task 30564 1726882836.40584: _execute() done 30564 1726882836.40592: dumping result to json 30564 1726882836.40599: done dumping result, returning 30564 1726882836.40608: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000000b41] 30564 1726882836.40618: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b41 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
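The two skips above ("Enable and start wpa_supplicant", "Enable network service") both follow the same pattern in the trace: each `when` conditional is evaluated in turn, and the first false one produces a `skipping:` result carrying `false_condition`. As a hypothetical sketch only (real Ansible templates conditionals through Jinja2; plain `eval` here is a stand-in for illustration):

```python
# Hypothetical sketch, NOT Ansible's real implementation: approximates how the
# TaskExecutor in this trace walks a task's `when` clauses and emits the
# "skipping" payload when one evaluates false.
def evaluate_when(conditionals, variables):
    for cond in conditionals:
        # Real Ansible renders the expression with Jinja2; eval() with stripped
        # builtins is only an illustrative simplification.
        if not eval(cond, {"__builtins__": {}}, dict(variables)):
            return {
                "changed": False,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return None  # all conditionals true; the task would run
```

With `network_provider` set to `"nm"`, the clause `network_provider == "initscripts"` is the first false condition, matching the `false_condition` field in the skip results above.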
30564 1726882836.40767: no more pending results, returning what we have 30564 1726882836.40773: results queue empty 30564 1726882836.40774: checking for any_errors_fatal 30564 1726882836.40780: done checking for any_errors_fatal 30564 1726882836.40781: checking for max_fail_percentage 30564 1726882836.40783: done checking for max_fail_percentage 30564 1726882836.40785: checking to see if all hosts have failed and the running result is not ok 30564 1726882836.40785: done checking to see if all hosts have failed 30564 1726882836.40786: getting the remaining hosts for this loop 30564 1726882836.40787: done getting the remaining hosts for this loop 30564 1726882836.40791: getting the next task for host managed_node2 30564 1726882836.40799: done getting next task for host managed_node2 30564 1726882836.40803: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882836.40808: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882836.40833: getting variables 30564 1726882836.40835: in VariableManager get_vars() 30564 1726882836.40872: Calling all_inventory to load vars for managed_node2 30564 1726882836.40875: Calling groups_inventory to load vars for managed_node2 30564 1726882836.40877: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882836.40890: Calling all_plugins_play to load vars for managed_node2 30564 1726882836.40893: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882836.40896: Calling groups_plugins_play to load vars for managed_node2 30564 1726882836.41437: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b41 30564 1726882836.41440: WORKER PROCESS EXITING 30564 1726882836.42498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882836.45755: done with get_vars() 30564 1726882836.45786: done getting variables 30564 1726882836.45840: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:40:36 -0400 (0:00:00.077) 0:00:35.040 ****** 30564 1726882836.45881: entering _queue_task() for managed_node2/copy 30564 1726882836.46165: worker is 1 (out of 1 available) 30564 1726882836.46179: exiting _queue_task() for managed_node2/copy 30564 1726882836.46192: done queuing things up, now waiting for results queue to drain 30564 1726882836.46193: waiting for pending 
results... 30564 1726882836.46480: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882836.46628: in run() - task 0e448fcc-3ce9-4216-acec-000000000b42 30564 1726882836.46646: variable 'ansible_search_path' from source: unknown 30564 1726882836.46652: variable 'ansible_search_path' from source: unknown 30564 1726882836.46691: calling self._execute() 30564 1726882836.46793: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882836.46803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882836.46815: variable 'omit' from source: magic vars 30564 1726882836.47180: variable 'ansible_distribution_major_version' from source: facts 30564 1726882836.47197: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882836.47321: variable 'network_provider' from source: set_fact 30564 1726882836.47332: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882836.47339: when evaluation is False, skipping this task 30564 1726882836.47345: _execute() done 30564 1726882836.47351: dumping result to json 30564 1726882836.47357: done dumping result, returning 30564 1726882836.47370: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000000b42] 30564 1726882836.47382: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b42 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882836.47540: no more pending results, returning what we have 30564 1726882836.47544: results queue empty 30564 1726882836.47545: checking for any_errors_fatal 30564 1726882836.47552: done checking for any_errors_fatal 30564 1726882836.47553: checking for max_fail_percentage 
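Several results in this trace are replaced by a fixed "censored" message because the task set `no_log: true`. A hedged sketch of that censoring step (the exact set of bookkeeping keys Ansible preserves is an assumption here, not taken from the log):

```python
# Hedged sketch of `no_log: true` censoring as seen in the results above.
# The preserved-key set is an assumption for illustration.
CENSORED_MSG = ("the output has been hidden due to the fact that "
                "'no_log: true' was specified for this result")

def censor_result(result, no_log):
    if not no_log:
        return result
    # Keep only status bookkeeping; drop everything that could leak data.
    preserved = {k: v for k, v in result.items()
                 if k in ("changed", "failed", "skipped", "attempts")}
    preserved["censored"] = CENSORED_MSG
    return preserved
```

This is why the "Enable network service" skip above shows only `censored` and `changed`, while the uncensored initscripts skip that follows still exposes `false_condition` and `skip_reason`.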
30564 1726882836.47555: done checking for max_fail_percentage 30564 1726882836.47556: checking to see if all hosts have failed and the running result is not ok 30564 1726882836.47556: done checking to see if all hosts have failed 30564 1726882836.47557: getting the remaining hosts for this loop 30564 1726882836.47559: done getting the remaining hosts for this loop 30564 1726882836.47562: getting the next task for host managed_node2 30564 1726882836.47573: done getting next task for host managed_node2 30564 1726882836.47576: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882836.47582: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882836.47602: getting variables 30564 1726882836.47604: in VariableManager get_vars() 30564 1726882836.47636: Calling all_inventory to load vars for managed_node2 30564 1726882836.47639: Calling groups_inventory to load vars for managed_node2 30564 1726882836.47641: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882836.47652: Calling all_plugins_play to load vars for managed_node2 30564 1726882836.47655: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882836.47658: Calling groups_plugins_play to load vars for managed_node2 30564 1726882836.48845: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b42 30564 1726882836.48849: WORKER PROCESS EXITING 30564 1726882836.50958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882836.53739: done with get_vars() 30564 1726882836.53760: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:40:36 -0400 (0:00:00.079) 0:00:35.119 ****** 30564 1726882836.53856: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882836.54145: worker is 1 (out of 1 available) 30564 1726882836.54159: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882836.54173: done queuing things up, now waiting for results queue to drain 30564 1726882836.54174: waiting for pending results... 
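The template lookup for `get_ansible_managed.j2` a little further down in this trace prints its `search_path`: for every evaluation path, a `templates/` subdirectory is tried before the directory itself. That ordering can be sketched as (a simplification of Ansible's real path-resolution logic):

```python
import os

# Sketch of the candidate ordering the template lookup logs as "search_path":
# each evaluation path contributes "<path>/templates/<name>" then "<path>/<name>".
def template_search_paths(eval_paths, name):
    candidates = []
    for base in eval_paths:
        candidates.append(os.path.join(base, "templates", name))
        candidates.append(os.path.join(base, name))
    return candidates
```

Running this over the role directory and its `tasks/` subdirectory reproduces the templates-first ordering visible in the logged `search_path`.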
30564 1726882836.54457: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882836.54600: in run() - task 0e448fcc-3ce9-4216-acec-000000000b43 30564 1726882836.54623: variable 'ansible_search_path' from source: unknown 30564 1726882836.54629: variable 'ansible_search_path' from source: unknown 30564 1726882836.54673: calling self._execute() 30564 1726882836.54775: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882836.54787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882836.54801: variable 'omit' from source: magic vars 30564 1726882836.55742: variable 'ansible_distribution_major_version' from source: facts 30564 1726882836.55764: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882836.55775: variable 'omit' from source: magic vars 30564 1726882836.56034: variable 'omit' from source: magic vars 30564 1726882836.56570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882836.60355: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882836.60429: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882836.60469: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882836.60513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882836.60568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882836.60658: variable 'network_provider' from source: set_fact 30564 1726882836.60933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882836.61095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882836.61178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882836.61224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882836.61274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882836.61424: variable 'omit' from source: magic vars 30564 1726882836.61630: variable 'omit' from source: magic vars 30564 1726882836.61905: variable 'network_connections' from source: include params 30564 1726882836.61921: variable 'interface' from source: play vars 30564 1726882836.62120: variable 'interface' from source: play vars 30564 1726882836.62552: variable 'omit' from source: magic vars 30564 1726882836.62585: variable '__lsr_ansible_managed' from source: task vars 30564 1726882836.62718: variable '__lsr_ansible_managed' from source: task vars 30564 1726882836.62926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882836.63935: Loaded config def from plugin (lookup/template) 30564 1726882836.63969: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882836.64143: File lookup term: get_ansible_managed.j2 30564 1726882836.64167: variable 
'ansible_search_path' from source: unknown 30564 1726882836.64193: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882836.64249: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882836.64305: variable 'ansible_search_path' from source: unknown 30564 1726882836.77745: variable 'ansible_managed' from source: unknown 30564 1726882836.78024: variable 'omit' from source: magic vars 30564 1726882836.78175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882836.78206: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882836.78227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882836.78248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882836.78265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882836.78633: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882836.78643: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882836.78728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882836.78944: Set connection var ansible_timeout to 10 30564 1726882836.78956: Set connection var ansible_pipelining to False 30564 1726882836.78962: Set connection var ansible_shell_type to sh 30564 1726882836.78976: Set connection var ansible_shell_executable to /bin/sh 30564 1726882836.78988: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882836.78995: Set connection var ansible_connection to ssh 30564 1726882836.79024: variable 'ansible_shell_executable' from source: unknown 30564 1726882836.79161: variable 'ansible_connection' from source: unknown 30564 1726882836.79173: variable 'ansible_module_compression' from source: unknown 30564 1726882836.79181: variable 'ansible_shell_type' from source: unknown 30564 1726882836.79187: variable 'ansible_shell_executable' from source: unknown 30564 1726882836.79195: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882836.79202: variable 'ansible_pipelining' from source: unknown 30564 1726882836.79208: variable 'ansible_timeout' from source: unknown 30564 1726882836.79215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882836.79355: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882836.79496: variable 'omit' from 
source: magic vars 30564 1726882836.79508: starting attempt loop 30564 1726882836.79516: running the handler 30564 1726882836.79533: _low_level_execute_command(): starting 30564 1726882836.79544: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882836.81375: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.81379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.81418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882836.81422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.81424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882836.81428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.81612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882836.81615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882836.81670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882836.81893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882836.83547: stdout chunk (state=3): >>>/root <<< 30564 1726882836.83654: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882836.83717: stderr chunk (state=3): >>><<< 30564 1726882836.83720: stdout chunk (state=3): >>><<< 30564 1726882836.83743: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882836.83754: _low_level_execute_command(): starting 30564 1726882836.83761: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188 `" && echo ansible-tmp-1726882836.8374267-32176-242010467236188="` echo /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188 `" ) && sleep 0' 30564 1726882836.84357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882836.84367: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.84386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.84399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.84435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.84442: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882836.84452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.84468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882836.84480: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882836.84487: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882836.84495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.84504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.84519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.84523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.84529: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882836.84538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.84613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882836.84637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882836.84641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882836.84777: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882836.86683: stdout chunk (state=3): >>>ansible-tmp-1726882836.8374267-32176-242010467236188=/root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188 <<< 30564 1726882836.86853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882836.86856: stderr chunk (state=3): >>><<< 30564 1726882836.86858: stdout chunk (state=3): >>><<< 30564 1726882836.86878: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882836.8374267-32176-242010467236188=/root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882836.86923: variable 'ansible_module_compression' from source: unknown 30564 1726882836.86968: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882836.87023: variable 'ansible_facts' from source: unknown 30564 1726882836.87152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188/AnsiballZ_network_connections.py 30564 1726882836.87297: Sending initial data 30564 1726882836.87300: Sent initial data (168 bytes) 30564 1726882836.88282: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882836.88286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.88289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.88291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.88477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.88484: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882836.88488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.88491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882836.88493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882836.88495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882836.88497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.88499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.88501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.88503: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.88505: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882836.88507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.88508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882836.88510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882836.88512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882836.89194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882836.90999: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882836.91104: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882836.91202: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpgj1p34sb /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188/AnsiballZ_network_connections.py <<< 30564 1726882836.91298: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882836.93380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882836.93487: stderr chunk (state=3): >>><<< 30564 
1726882836.93490: stdout chunk (state=3): >>><<< 30564 1726882836.93512: done transferring module to remote 30564 1726882836.93523: _low_level_execute_command(): starting 30564 1726882836.93528: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188/ /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188/AnsiballZ_network_connections.py && sleep 0' 30564 1726882836.95023: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882836.95783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.95793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.95808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.95846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.95853: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882836.95865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.95883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882836.95891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882836.95898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882836.95905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.95915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.95926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882836.95936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 30564 1726882836.95943: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882836.95950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882836.96025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882836.96039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882836.96072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882836.96177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882836.98054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882836.98058: stdout chunk (state=3): >>><<< 30564 1726882836.98067: stderr chunk (state=3): >>><<< 30564 1726882836.98089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882836.98092: _low_level_execute_command(): starting 30564 1726882836.98097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188/AnsiballZ_network_connections.py && sleep 0' 30564 1726882836.99928: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882836.99937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882836.99947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882836.99961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.00005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.00013: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882837.00020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.00033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882837.00041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882837.00047: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882837.00055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.00065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.00080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.00088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.00094: stderr chunk (state=3): 
>>>debug2: match found <<< 30564 1726882837.00103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.00178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882837.00292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882837.00302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882837.01004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882837.26144: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882837.28980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882837.28984: stdout chunk (state=3): >>><<< 30564 1726882837.28989: stderr chunk (state=3): >>><<< 30564 1726882837.29015: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882837.29056: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882837.29066: _low_level_execute_command(): starting 30564 1726882837.29072: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882836.8374267-32176-242010467236188/ > /dev/null 2>&1 && sleep 0' 30564 1726882837.29917: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.29923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.29962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.29967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.29984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.29988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.30066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882837.30069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882837.30082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882837.30208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882837.32078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882837.32097: stderr chunk (state=3): >>><<< 30564 1726882837.32100: stdout chunk (state=3): >>><<< 30564 1726882837.32551: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882837.32554: handler run complete 30564 1726882837.32556: attempt loop complete, returning result 30564 1726882837.32558: _execute() done 30564 1726882837.32561: dumping result to json 30564 1726882837.32563: done dumping result, returning 30564 1726882837.32567: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-000000000b43] 30564 1726882837.32569: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b43 30564 1726882837.32648: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b43 30564 1726882837.32652: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb 30564 1726882837.32758: no more pending results, returning what we have 30564 1726882837.32761: results queue empty 30564 1726882837.32762: checking for any_errors_fatal 30564 1726882837.32769: done checking for any_errors_fatal 30564 1726882837.32770: checking for 
max_fail_percentage 30564 1726882837.32772: done checking for max_fail_percentage 30564 1726882837.32773: checking to see if all hosts have failed and the running result is not ok 30564 1726882837.32774: done checking to see if all hosts have failed 30564 1726882837.32775: getting the remaining hosts for this loop 30564 1726882837.32776: done getting the remaining hosts for this loop 30564 1726882837.32779: getting the next task for host managed_node2 30564 1726882837.32785: done getting next task for host managed_node2 30564 1726882837.32789: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882837.32794: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882837.32805: getting variables 30564 1726882837.32807: in VariableManager get_vars() 30564 1726882837.32839: Calling all_inventory to load vars for managed_node2 30564 1726882837.32842: Calling groups_inventory to load vars for managed_node2 30564 1726882837.32844: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882837.32853: Calling all_plugins_play to load vars for managed_node2 30564 1726882837.32856: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882837.32859: Calling groups_plugins_play to load vars for managed_node2 30564 1726882837.35082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882837.37169: done with get_vars() 30564 1726882837.37193: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:40:37 -0400 (0:00:00.834) 0:00:35.954 ****** 30564 1726882837.37278: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882837.37600: worker is 1 (out of 1 available) 30564 1726882837.37613: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882837.37628: done queuing things up, now waiting for results queue to drain 30564 1726882837.37629: waiting for pending results... 
30564 1726882837.37926: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882837.38084: in run() - task 0e448fcc-3ce9-4216-acec-000000000b44 30564 1726882837.38105: variable 'ansible_search_path' from source: unknown 30564 1726882837.38113: variable 'ansible_search_path' from source: unknown 30564 1726882837.38153: calling self._execute() 30564 1726882837.38259: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.38274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.38295: variable 'omit' from source: magic vars 30564 1726882837.38708: variable 'ansible_distribution_major_version' from source: facts 30564 1726882837.38740: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882837.39474: variable 'network_state' from source: role '' defaults 30564 1726882837.39496: Evaluated conditional (network_state != {}): False 30564 1726882837.39503: when evaluation is False, skipping this task 30564 1726882837.39510: _execute() done 30564 1726882837.39518: dumping result to json 30564 1726882837.39525: done dumping result, returning 30564 1726882837.39535: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000000b44] 30564 1726882837.39545: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b44
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882837.39705: no more pending results, returning what we have 30564 1726882837.39711: results queue empty 30564 1726882837.39712: checking for any_errors_fatal 30564 1726882837.39729: done checking for any_errors_fatal 30564 1726882837.39730: checking for max_fail_percentage 30564 1726882837.39732: done checking for max_fail_percentage 30564 1726882837.39733: 
checking to see if all hosts have failed and the running result is not ok 30564 1726882837.39734: done checking to see if all hosts have failed 30564 1726882837.39734: getting the remaining hosts for this loop 30564 1726882837.39736: done getting the remaining hosts for this loop 30564 1726882837.39740: getting the next task for host managed_node2 30564 1726882837.39749: done getting next task for host managed_node2 30564 1726882837.39753: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882837.39760: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882837.39787: getting variables 30564 1726882837.39789: in VariableManager get_vars() 30564 1726882837.39827: Calling all_inventory to load vars for managed_node2 30564 1726882837.39830: Calling groups_inventory to load vars for managed_node2 30564 1726882837.39832: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882837.39845: Calling all_plugins_play to load vars for managed_node2 30564 1726882837.39849: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882837.39852: Calling groups_plugins_play to load vars for managed_node2 30564 1726882837.41283: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b44 30564 1726882837.41287: WORKER PROCESS EXITING 30564 1726882837.41605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882837.44385: done with get_vars() 30564 1726882837.44410: done getting variables 30564 1726882837.44471: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:40:37 -0400 (0:00:00.072) 0:00:36.026 ****** 30564 1726882837.44506: entering _queue_task() for managed_node2/debug 30564 1726882837.44808: worker is 1 (out of 1 available) 30564 1726882837.44821: exiting _queue_task() for managed_node2/debug 30564 1726882837.44836: done queuing things up, now waiting for results queue to drain 30564 1726882837.44837: waiting for pending results... 
30564 1726882837.45124: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882837.45262: in run() - task 0e448fcc-3ce9-4216-acec-000000000b45 30564 1726882837.45289: variable 'ansible_search_path' from source: unknown 30564 1726882837.45297: variable 'ansible_search_path' from source: unknown 30564 1726882837.45337: calling self._execute() 30564 1726882837.45443: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.45456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.45475: variable 'omit' from source: magic vars 30564 1726882837.45887: variable 'ansible_distribution_major_version' from source: facts 30564 1726882837.45905: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882837.45917: variable 'omit' from source: magic vars 30564 1726882837.45997: variable 'omit' from source: magic vars 30564 1726882837.46047: variable 'omit' from source: magic vars 30564 1726882837.46097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882837.46133: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882837.46159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882837.46182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882837.46201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882837.46233: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882837.46240: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.46247: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882837.46353: Set connection var ansible_timeout to 10 30564 1726882837.46371: Set connection var ansible_pipelining to False 30564 1726882837.46380: Set connection var ansible_shell_type to sh 30564 1726882837.46389: Set connection var ansible_shell_executable to /bin/sh 30564 1726882837.46400: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882837.46410: Set connection var ansible_connection to ssh 30564 1726882837.46435: variable 'ansible_shell_executable' from source: unknown 30564 1726882837.46442: variable 'ansible_connection' from source: unknown 30564 1726882837.46448: variable 'ansible_module_compression' from source: unknown 30564 1726882837.46454: variable 'ansible_shell_type' from source: unknown 30564 1726882837.46459: variable 'ansible_shell_executable' from source: unknown 30564 1726882837.46466: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.46476: variable 'ansible_pipelining' from source: unknown 30564 1726882837.46487: variable 'ansible_timeout' from source: unknown 30564 1726882837.46494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.46646: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882837.46674: variable 'omit' from source: magic vars 30564 1726882837.46689: starting attempt loop 30564 1726882837.46701: running the handler 30564 1726882837.46829: variable '__network_connections_result' from source: set_fact 30564 1726882837.46887: handler run complete 30564 1726882837.46909: attempt loop complete, returning result 30564 1726882837.46919: _execute() done 30564 1726882837.46925: dumping result to json 30564 1726882837.46931: 
done dumping result, returning 30564 1726882837.46943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-000000000b45] 30564 1726882837.46956: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b45
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb"
    ]
}
30564 1726882837.47129: no more pending results, returning what we have 30564 1726882837.47133: results queue empty 30564 1726882837.47134: checking for any_errors_fatal 30564 1726882837.47140: done checking for any_errors_fatal 30564 1726882837.47141: checking for max_fail_percentage 30564 1726882837.47143: done checking for max_fail_percentage 30564 1726882837.47143: checking to see if all hosts have failed and the running result is not ok 30564 1726882837.47144: done checking to see if all hosts have failed 30564 1726882837.47145: getting the remaining hosts for this loop 30564 1726882837.47147: done getting the remaining hosts for this loop 30564 1726882837.47150: getting the next task for host managed_node2 30564 1726882837.47158: done getting next task for host managed_node2 30564 1726882837.47162: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882837.47170: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882837.47185: getting variables 30564 1726882837.47187: in VariableManager get_vars() 30564 1726882837.47222: Calling all_inventory to load vars for managed_node2 30564 1726882837.47225: Calling groups_inventory to load vars for managed_node2 30564 1726882837.47227: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882837.47238: Calling all_plugins_play to load vars for managed_node2 30564 1726882837.47241: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882837.47244: Calling groups_plugins_play to load vars for managed_node2 30564 1726882837.48283: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b45 30564 1726882837.48290: WORKER PROCESS EXITING 30564 1726882837.49357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882837.52559: done with get_vars() 30564 1726882837.52587: done getting variables 30564 1726882837.52670: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:40:37 -0400 (0:00:00.082) 0:00:36.108 ****** 30564 1726882837.52710: entering _queue_task() for managed_node2/debug 30564 1726882837.53013: worker is 1 (out of 1 available) 30564 1726882837.53026: exiting _queue_task() for managed_node2/debug 30564 1726882837.53038: done queuing things up, now waiting for results queue to drain 30564 1726882837.53039: waiting for pending results... 30564 1726882837.53339: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882837.53506: in run() - task 0e448fcc-3ce9-4216-acec-000000000b46 30564 1726882837.53531: variable 'ansible_search_path' from source: unknown 30564 1726882837.53540: variable 'ansible_search_path' from source: unknown 30564 1726882837.53580: calling self._execute() 30564 1726882837.53683: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.53700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.53715: variable 'omit' from source: magic vars 30564 1726882837.54334: variable 'ansible_distribution_major_version' from source: facts 30564 1726882837.54351: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882837.54361: variable 'omit' from source: magic vars 30564 1726882837.54433: variable 'omit' from source: magic vars 30564 1726882837.54471: variable 'omit' from source: magic vars 30564 1726882837.54539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882837.54584: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882837.54611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882837.54637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882837.54654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882837.54694: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882837.54705: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.54746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.54886: Set connection var ansible_timeout to 10 30564 1726882837.54896: Set connection var ansible_pipelining to False 30564 1726882837.54902: Set connection var ansible_shell_type to sh 30564 1726882837.54944: Set connection var ansible_shell_executable to /bin/sh 30564 1726882837.54963: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882837.54972: Set connection var ansible_connection to ssh 30564 1726882837.55000: variable 'ansible_shell_executable' from source: unknown 30564 1726882837.55007: variable 'ansible_connection' from source: unknown 30564 1726882837.55037: variable 'ansible_module_compression' from source: unknown 30564 1726882837.55049: variable 'ansible_shell_type' from source: unknown 30564 1726882837.55058: variable 'ansible_shell_executable' from source: unknown 30564 1726882837.55072: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.55082: variable 'ansible_pipelining' from source: unknown 30564 1726882837.55088: variable 'ansible_timeout' from source: unknown 30564 1726882837.55095: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882837.55341: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882837.55359: variable 'omit' from source: magic vars 30564 1726882837.55373: starting attempt loop 30564 1726882837.55380: running the handler 30564 1726882837.55459: variable '__network_connections_result' from source: set_fact 30564 1726882837.55852: variable '__network_connections_result' from source: set_fact 30564 1726882837.56009: handler run complete 30564 1726882837.56049: attempt loop complete, returning result 30564 1726882837.56057: _execute() done 30564 1726882837.56066: dumping result to json 30564 1726882837.56075: done dumping result, returning 30564 1726882837.56085: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-000000000b46] 30564 1726882837.56092: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b46
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb"
        ]
    }
}
30564 1726882837.56277: no more pending results, returning what we have 30564 
1726882837.56281: results queue empty 30564 1726882837.56283: checking for any_errors_fatal 30564 1726882837.56292: done checking for any_errors_fatal 30564 1726882837.56293: checking for max_fail_percentage 30564 1726882837.56295: done checking for max_fail_percentage 30564 1726882837.56296: checking to see if all hosts have failed and the running result is not ok 30564 1726882837.56297: done checking to see if all hosts have failed 30564 1726882837.56298: getting the remaining hosts for this loop 30564 1726882837.56300: done getting the remaining hosts for this loop 30564 1726882837.56305: getting the next task for host managed_node2 30564 1726882837.56315: done getting next task for host managed_node2 30564 1726882837.56320: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882837.56326: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882837.56339: getting variables 30564 1726882837.56341: in VariableManager get_vars() 30564 1726882837.56385: Calling all_inventory to load vars for managed_node2 30564 1726882837.56388: Calling groups_inventory to load vars for managed_node2 30564 1726882837.56396: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882837.56408: Calling all_plugins_play to load vars for managed_node2 30564 1726882837.56411: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882837.56414: Calling groups_plugins_play to load vars for managed_node2 30564 1726882837.57454: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b46 30564 1726882837.57457: WORKER PROCESS EXITING 30564 1726882837.59661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882837.61796: done with get_vars() 30564 1726882837.61822: done getting variables 30564 1726882837.61916: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:40:37 -0400 (0:00:00.092) 0:00:36.200 ****** 30564 1726882837.61950: entering _queue_task() for managed_node2/debug 30564 1726882837.62603: worker is 1 (out of 1 available) 30564 1726882837.62619: exiting _queue_task() for managed_node2/debug 30564 1726882837.62634: done queuing things up, now waiting for results queue to drain 30564 1726882837.62635: waiting for pending results... 
30564 1726882837.62999: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882837.63153: in run() - task 0e448fcc-3ce9-4216-acec-000000000b47 30564 1726882837.63177: variable 'ansible_search_path' from source: unknown 30564 1726882837.63181: variable 'ansible_search_path' from source: unknown 30564 1726882837.63216: calling self._execute() 30564 1726882837.63323: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.63328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.63338: variable 'omit' from source: magic vars 30564 1726882837.63743: variable 'ansible_distribution_major_version' from source: facts 30564 1726882837.63756: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882837.63910: variable 'network_state' from source: role '' defaults 30564 1726882837.63921: Evaluated conditional (network_state != {}): False 30564 1726882837.63924: when evaluation is False, skipping this task 30564 1726882837.63927: _execute() done 30564 1726882837.63929: dumping result to json 30564 1726882837.63931: done dumping result, returning 30564 1726882837.63939: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000000b47] 30564 1726882837.63950: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b47 30564 1726882837.64046: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b47 30564 1726882837.64050: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
30564 1726882837.64104: no more pending results, returning what we have 30564 1726882837.64109: results queue empty 30564 1726882837.64110: checking for any_errors_fatal 30564 1726882837.64120: done checking for any_errors_fatal 30564 1726882837.64121: checking for 
max_fail_percentage 30564 1726882837.64124: done checking for max_fail_percentage 30564 1726882837.64125: checking to see if all hosts have failed and the running result is not ok 30564 1726882837.64125: done checking to see if all hosts have failed 30564 1726882837.64126: getting the remaining hosts for this loop 30564 1726882837.64128: done getting the remaining hosts for this loop 30564 1726882837.64132: getting the next task for host managed_node2 30564 1726882837.64141: done getting next task for host managed_node2 30564 1726882837.64145: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882837.64152: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882837.64179: getting variables 30564 1726882837.64181: in VariableManager get_vars() 30564 1726882837.64216: Calling all_inventory to load vars for managed_node2 30564 1726882837.64219: Calling groups_inventory to load vars for managed_node2 30564 1726882837.64222: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882837.64234: Calling all_plugins_play to load vars for managed_node2 30564 1726882837.64237: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882837.64240: Calling groups_plugins_play to load vars for managed_node2 30564 1726882837.66096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882837.69424: done with get_vars() 30564 1726882837.69456: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:40:37 -0400 (0:00:00.077) 0:00:36.278 ****** 30564 1726882837.69817: entering _queue_task() for managed_node2/ping 30564 1726882837.70529: worker is 1 (out of 1 available) 30564 1726882837.70542: exiting _queue_task() for managed_node2/ping 30564 1726882837.70556: done queuing things up, now waiting for results queue to drain 30564 1726882837.70558: waiting for pending results... 
30564 1726882837.71562: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882837.71839: in run() - task 0e448fcc-3ce9-4216-acec-000000000b48 30564 1726882837.71852: variable 'ansible_search_path' from source: unknown 30564 1726882837.71880: variable 'ansible_search_path' from source: unknown 30564 1726882837.72105: calling self._execute() 30564 1726882837.72243: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.72249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.72270: variable 'omit' from source: magic vars 30564 1726882837.73311: variable 'ansible_distribution_major_version' from source: facts 30564 1726882837.73341: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882837.73344: variable 'omit' from source: magic vars 30564 1726882837.73539: variable 'omit' from source: magic vars 30564 1726882837.73590: variable 'omit' from source: magic vars 30564 1726882837.73747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882837.73783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882837.73801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882837.73818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882837.73829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882837.73990: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882837.73994: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.74014: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882837.74240: Set connection var ansible_timeout to 10 30564 1726882837.74246: Set connection var ansible_pipelining to False 30564 1726882837.74250: Set connection var ansible_shell_type to sh 30564 1726882837.74252: Set connection var ansible_shell_executable to /bin/sh 30564 1726882837.74311: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882837.74315: Set connection var ansible_connection to ssh 30564 1726882837.74351: variable 'ansible_shell_executable' from source: unknown 30564 1726882837.74354: variable 'ansible_connection' from source: unknown 30564 1726882837.74357: variable 'ansible_module_compression' from source: unknown 30564 1726882837.74359: variable 'ansible_shell_type' from source: unknown 30564 1726882837.74369: variable 'ansible_shell_executable' from source: unknown 30564 1726882837.74388: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882837.74391: variable 'ansible_pipelining' from source: unknown 30564 1726882837.74501: variable 'ansible_timeout' from source: unknown 30564 1726882837.74505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882837.75095: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882837.75114: variable 'omit' from source: magic vars 30564 1726882837.75117: starting attempt loop 30564 1726882837.75119: running the handler 30564 1726882837.75141: _low_level_execute_command(): starting 30564 1726882837.75259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882837.77456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882837.77475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882837.77487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.77502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.77595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.77598: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882837.77605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.77619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882837.77627: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882837.77634: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882837.77643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.77656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.77670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.77681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.77691: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882837.77765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.77851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882837.77949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882837.77952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882837.78095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882837.79770: stdout chunk (state=3): >>>/root <<< 30564 1726882837.79882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882837.79954: stderr chunk (state=3): >>><<< 30564 1726882837.79957: stdout chunk (state=3): >>><<< 30564 1726882837.79991: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882837.80005: _low_level_execute_command(): starting 30564 1726882837.80012: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661 `" && echo ansible-tmp-1726882837.799899-32235-213649798561661="` echo /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661 `" ) && sleep 0' 30564 1726882837.81800: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 30564 1726882837.81810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.81869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.81881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.81983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.81992: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882837.81998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.82011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882837.82028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882837.82049: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882837.82052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.82162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.82181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.82194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.82207: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882837.82210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.82308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882837.82317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882837.82329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30564 1726882837.82502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882837.84431: stdout chunk (state=3): >>>ansible-tmp-1726882837.799899-32235-213649798561661=/root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661 <<< 30564 1726882837.84580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882837.84615: stderr chunk (state=3): >>><<< 30564 1726882837.84618: stdout chunk (state=3): >>><<< 30564 1726882837.84637: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882837.799899-32235-213649798561661=/root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882837.84690: variable 'ansible_module_compression' from source: unknown 30564 1726882837.84737: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882837.84765: variable 'ansible_facts' from source: unknown 30564 1726882837.84851: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661/AnsiballZ_ping.py 30564 1726882837.85493: Sending initial data 30564 1726882837.85496: Sent initial data (152 bytes) 30564 1726882837.88811: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.88815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.89027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.89035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.89110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882837.89116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.89304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882837.89325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882837.90490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
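The `ANSIBALLZ: using cached module` and `transferring module to remote ... AnsiballZ_ping.py` lines above reflect Ansible's AnsiballZ packaging: the module source and its dependencies are zipped, base64-embedded in a small self-extracting wrapper script, and cached per compression method (hence the `ansible.modules.ping-ZIP_DEFLATED` cache key). The wrapper, not the raw module, is what gets copied over sftp. The following is a rough, illustrative sketch of that idea only, not Ansible's real builder, and `package_module` is a hypothetical helper:

```python
import base64
import io
import zipfile

def package_module(module_name, module_source):
    """Illustrative sketch of the AnsiballZ concept (NOT Ansible's
    actual implementation): zip the module source with ZIP_DEFLATED,
    base64-encode the archive, and embed it in a small wrapper script
    that is what actually gets transferred to the remote host."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("ansible/modules/%s.py" % module_name, module_source)
    payload = base64.b64encode(buf.getvalue()).decode("ascii")
    # The wrapper extracts the embedded zip at runtime; real AnsiballZ
    # wrappers also set up sys.path and invoke the module's main().
    wrapper = (
        "#!/usr/bin/python\n"
        "# self-extracting wrapper (illustrative only)\n"
        "import base64, io, zipfile\n"
        "PAYLOAD = '%s'\n"
        "zf = zipfile.ZipFile(io.BytesIO(base64.b64decode(PAYLOAD)))\n"
        "print(zf.namelist())\n" % payload
    )
    return wrapper
```

Caching the built wrapper is why the log shows `using cached module` instead of a fresh build: the same module with the same compression only needs to be packaged once per run.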
30564 1726882837.91291: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882837.91388: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882837.91493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpuo7repxa /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661/AnsiballZ_ping.py <<< 30564 1726882837.91594: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882837.93005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882837.93235: stderr chunk (state=3): >>><<< 30564 1726882837.93239: stdout chunk (state=3): >>><<< 30564 1726882837.93241: done transferring module to remote 30564 1726882837.93243: _low_level_execute_command(): starting 30564 1726882837.93245: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661/ /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661/AnsiballZ_ping.py && sleep 0' 30564 1726882837.94082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882837.94091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.94101: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30564 1726882837.94115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.94158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.94167: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882837.94180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.94193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882837.94200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882837.94207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882837.94215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.94225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.94236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.94245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.94254: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882837.94269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.94343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882837.94362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882837.94381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882837.94504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882837.96390: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 30564 1726882837.96394: stdout chunk (state=3): >>><<< 30564 1726882837.96400: stderr chunk (state=3): >>><<< 30564 1726882837.96416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882837.96419: _low_level_execute_command(): starting 30564 1726882837.96424: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661/AnsiballZ_ping.py && sleep 0' 30564 1726882837.97722: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882837.97725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.97728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.97730: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.97732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.97734: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882837.97736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.97738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882837.97739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882837.97741: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882837.97743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882837.97745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882837.97747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882837.97749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882837.97751: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882837.97753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882837.97851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882837.97855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882837.97857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882837.98103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882838.11314: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882838.12358: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882838.12362: stdout chunk (state=3): >>><<< 30564 1726882838.12370: stderr chunk (state=3): >>><<< 30564 1726882838.12391: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
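The `{"ping": "pong"}` payload returned above is essentially the whole contract of the `ping` module: it echoes back its `data` argument (default `"pong"`), and raises an exception when `data` is set to `crash` so that error handling can be exercised. A stripped-down sketch of that behavior, not the module's real source:

```python
def ping(data="pong"):
    """Minimal sketch of ansible.builtin.ping semantics: echo the
    `data` argument back, or deliberately fail when asked to crash."""
    if data == "crash":
        # the real module raises here to let callers test failure paths
        raise Exception("boom")
    return {"ping": data}
```

This is why the role uses it for the "Re-test connectivity" task: a successful round trip proves the SSH connection, Python interpreter, and module pipeline all still work on the managed host.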
30564 1726882838.12420: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882838.12427: _low_level_execute_command(): starting 30564 1726882838.12433: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882837.799899-32235-213649798561661/ > /dev/null 2>&1 && sleep 0' 30564 1726882838.13688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882838.13691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882838.13737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882838.13743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882838.13757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882838.13765: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882838.13849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882838.13855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882838.13884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882838.13993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882838.15837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882838.15909: stderr chunk (state=3): >>><<< 30564 1726882838.15912: stdout chunk (state=3): >>><<< 30564 1726882838.15928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30564 1726882838.15934: handler run complete 30564 1726882838.15955: attempt loop complete, returning result 30564 1726882838.15958: _execute() done 30564 1726882838.15960: dumping result to json 30564 1726882838.15962: done dumping result, returning 30564 1726882838.15973: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-000000000b48] 30564 1726882838.15979: sending task result for task 0e448fcc-3ce9-4216-acec-000000000b48 30564 1726882838.16081: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000b48 30564 1726882838.16085: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882838.16146: no more pending results, returning what we have 30564 1726882838.16149: results queue empty 30564 1726882838.16150: checking for any_errors_fatal 30564 1726882838.16157: done checking for any_errors_fatal 30564 1726882838.16158: checking for max_fail_percentage 30564 1726882838.16160: done checking for max_fail_percentage 30564 1726882838.16160: checking to see if all hosts have failed and the running result is not ok 30564 1726882838.16161: done checking to see if all hosts have failed 30564 1726882838.16162: getting the remaining hosts for this loop 30564 1726882838.16168: done getting the remaining hosts for this loop 30564 1726882838.16173: getting the next task for host managed_node2 30564 1726882838.16184: done getting next task for host managed_node2 30564 1726882838.16186: ^ task is: TASK: meta (role_complete) 30564 1726882838.16192: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882838.16206: getting variables 30564 1726882838.16208: in VariableManager get_vars() 30564 1726882838.16243: Calling all_inventory to load vars for managed_node2 30564 1726882838.16245: Calling groups_inventory to load vars for managed_node2 30564 1726882838.16247: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.16256: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.16259: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.16261: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.18879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.21015: done with get_vars() 30564 1726882838.21038: done getting variables 30564 1726882838.21133: done queuing things up, now waiting for results queue to drain 30564 1726882838.21135: results queue empty 30564 1726882838.21136: checking for any_errors_fatal 30564 1726882838.21139: done checking for 
any_errors_fatal 30564 1726882838.21140: checking for max_fail_percentage 30564 1726882838.21141: done checking for max_fail_percentage 30564 1726882838.21141: checking to see if all hosts have failed and the running result is not ok 30564 1726882838.21142: done checking to see if all hosts have failed 30564 1726882838.21143: getting the remaining hosts for this loop 30564 1726882838.21144: done getting the remaining hosts for this loop 30564 1726882838.21146: getting the next task for host managed_node2 30564 1726882838.21151: done getting next task for host managed_node2 30564 1726882838.21154: ^ task is: TASK: Show result 30564 1726882838.21156: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882838.21163: getting variables 30564 1726882838.21166: in VariableManager get_vars() 30564 1726882838.21176: Calling all_inventory to load vars for managed_node2 30564 1726882838.21178: Calling groups_inventory to load vars for managed_node2 30564 1726882838.21180: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.21185: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.21187: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.21190: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.23232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.32534: done with get_vars() 30564 1726882838.32558: done getting variables 30564 1726882838.32613: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:40:38 -0400 (0:00:00.629) 0:00:36.907 ****** 30564 1726882838.32638: entering _queue_task() for managed_node2/debug 30564 1726882838.33056: worker is 1 (out of 1 available) 30564 1726882838.33073: exiting _queue_task() for managed_node2/debug 30564 1726882838.33086: done queuing things up, now waiting for results queue to drain 30564 1726882838.33087: waiting for pending results... 
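The sequence of `Calling ... to load vars` lines above (all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play) reflects VariableManager merging variable sources from lowest to highest precedence: each later source is merged over the earlier ones, so on a key conflict the later source wins. A toy illustration of that low-to-high merge, where `merge_vars` is a hypothetical helper and the source names simply mirror the log, not Ansible's internal API:

```python
def merge_vars(sources):
    """Merge variable dicts low-to-high precedence: later sources
    override earlier ones on key conflicts, mirroring the order in
    which the log calls the var-loading stages."""
    merged = {}
    for name, varset in sources:
        merged.update(varset)  # later source wins on key conflicts
    return merged

# Toy data only; real sources would come from inventory and vars plugins.
result = merge_vars([
    ("all_inventory",       {"x": 1, "y": 1}),
    ("groups_inventory",    {"y": 2}),
    ("groups_plugins_play", {"x": 3}),
])
```

Here `result` ends up as `{"x": 3, "y": 2}`: `groups_plugins_play` overrides `x`, and `groups_inventory` overrides `y`, matching the later-wins ordering the log implies.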
30564 1726882838.33397: running TaskExecutor() for managed_node2/TASK: Show result 30564 1726882838.33532: in run() - task 0e448fcc-3ce9-4216-acec-000000000ad2 30564 1726882838.33566: variable 'ansible_search_path' from source: unknown 30564 1726882838.33570: variable 'ansible_search_path' from source: unknown 30564 1726882838.33597: calling self._execute() 30564 1726882838.33728: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882838.33734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882838.33737: variable 'omit' from source: magic vars 30564 1726882838.34728: variable 'ansible_distribution_major_version' from source: facts 30564 1726882838.34750: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882838.34763: variable 'omit' from source: magic vars 30564 1726882838.34828: variable 'omit' from source: magic vars 30564 1726882838.34872: variable 'omit' from source: magic vars 30564 1726882838.34935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882838.34980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882838.35012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882838.35045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882838.35067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882838.35108: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882838.35118: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882838.35131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882838.35385: Set 
connection var ansible_timeout to 10 30564 1726882838.35524: Set connection var ansible_pipelining to False 30564 1726882838.35530: Set connection var ansible_shell_type to sh 30564 1726882838.35540: Set connection var ansible_shell_executable to /bin/sh 30564 1726882838.35551: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882838.35557: Set connection var ansible_connection to ssh 30564 1726882838.35587: variable 'ansible_shell_executable' from source: unknown 30564 1726882838.35594: variable 'ansible_connection' from source: unknown 30564 1726882838.35600: variable 'ansible_module_compression' from source: unknown 30564 1726882838.35607: variable 'ansible_shell_type' from source: unknown 30564 1726882838.35619: variable 'ansible_shell_executable' from source: unknown 30564 1726882838.35625: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882838.35631: variable 'ansible_pipelining' from source: unknown 30564 1726882838.35637: variable 'ansible_timeout' from source: unknown 30564 1726882838.35643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882838.35809: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882838.35826: variable 'omit' from source: magic vars 30564 1726882838.35842: starting attempt loop 30564 1726882838.35848: running the handler 30564 1726882838.35899: variable '__network_connections_result' from source: set_fact 30564 1726882838.35997: variable '__network_connections_result' from source: set_fact 30564 1726882838.36120: handler run complete 30564 1726882838.36151: attempt loop complete, returning result 30564 1726882838.36168: _execute() done 30564 1726882838.36175: dumping result to json 30564 
1726882838.36183: done dumping result, returning 30564 1726882838.36193: done running TaskExecutor() for managed_node2/TASK: Show result [0e448fcc-3ce9-4216-acec-000000000ad2] 30564 1726882838.36202: sending task result for task 0e448fcc-3ce9-4216-acec-000000000ad2
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb"
        ]
    }
}
30564 1726882838.36408: no more pending results, returning what we have 30564 1726882838.36412: results queue empty 30564 1726882838.36413: checking for any_errors_fatal 30564 1726882838.36415: done checking for any_errors_fatal 30564 1726882838.36416: checking for max_fail_percentage 30564 1726882838.36418: done checking for max_fail_percentage 30564 1726882838.36418: checking to see if all hosts have failed and the running result is not ok 30564 1726882838.36419: done checking to see if all hosts have failed 30564 1726882838.36420: getting the remaining hosts for this loop 30564 1726882838.36422: done getting the remaining hosts for this loop 30564 1726882838.36426: getting the next task for host managed_node2 30564 1726882838.36437: done getting next task for host managed_node2 30564 1726882838.36441: ^ task is: TASK: Test 30564 1726882838.36444: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882838.36451: getting variables 30564 1726882838.36453: in VariableManager get_vars() 30564 1726882838.36490: Calling all_inventory to load vars for managed_node2 30564 1726882838.36493: Calling groups_inventory to load vars for managed_node2 30564 1726882838.36497: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.36509: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.36513: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.36516: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.37216: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000ad2 30564 1726882838.37220: WORKER PROCESS EXITING 30564 1726882838.38741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.44670: done with get_vars() 30564 1726882838.44708: done getting variables

TASK [Test] ********************************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30
Friday 20 September 2024 21:40:38 -0400 (0:00:00.121) 0:00:37.029 ******
30564 1726882838.44840: entering _queue_task() for managed_node2/include_tasks 30564 1726882838.45269: worker is 1 (out of 1 available) 30564 1726882838.45283: exiting _queue_task() for managed_node2/include_tasks 30564 1726882838.45296: done queuing things up, now waiting for results queue to drain 30564 1726882838.45297: waiting for pending results... 
30564 1726882838.45610: running TaskExecutor() for managed_node2/TASK: Test 30564 1726882838.45745: in run() - task 0e448fcc-3ce9-4216-acec-000000000a4d 30564 1726882838.45773: variable 'ansible_search_path' from source: unknown 30564 1726882838.45782: variable 'ansible_search_path' from source: unknown 30564 1726882838.45855: variable 'lsr_test' from source: include params 30564 1726882838.46108: variable 'lsr_test' from source: include params 30564 1726882838.46185: variable 'omit' from source: magic vars 30564 1726882838.46362: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882838.46382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882838.46402: variable 'omit' from source: magic vars 30564 1726882838.46673: variable 'ansible_distribution_major_version' from source: facts 30564 1726882838.46691: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882838.46703: variable 'item' from source: unknown 30564 1726882838.47075: variable 'item' from source: unknown 30564 1726882838.47518: variable 'item' from source: unknown 30564 1726882838.48024: variable 'item' from source: unknown 30564 1726882838.48740: dumping result to json 30564 1726882838.48753: done dumping result, returning 30564 1726882838.48781: done running TaskExecutor() for managed_node2/TASK: Test [0e448fcc-3ce9-4216-acec-000000000a4d] 30564 1726882838.48796: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4d 30564 1726882838.48925: no more pending results, returning what we have 30564 1726882838.48934: in VariableManager get_vars() 30564 1726882838.48991: Calling all_inventory to load vars for managed_node2 30564 1726882838.48994: Calling groups_inventory to load vars for managed_node2 30564 1726882838.48998: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.49024: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.49028: Calling 
groups_plugins_inventory to load vars for managed_node2 30564 1726882838.49032: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.50369: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4d 30564 1726882838.50373: WORKER PROCESS EXITING 30564 1726882838.52340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.58138: done with get_vars() 30564 1726882838.58162: variable 'ansible_search_path' from source: unknown 30564 1726882838.58166: variable 'ansible_search_path' from source: unknown 30564 1726882838.58243: we have included files to process 30564 1726882838.58244: generating all_blocks data 30564 1726882838.58246: done generating all_blocks data 30564 1726882838.58251: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882838.58252: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882838.58254: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882838.58544: done processing included file 30564 1726882838.58553: iterating over new_blocks loaded from include file 30564 1726882838.58555: in VariableManager get_vars() 30564 1726882838.58580: done with get_vars() 30564 1726882838.58582: filtering new block on tags 30564 1726882838.58634: done filtering new block on tags 30564 1726882838.58636: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30564 1726882838.58641: extending task lists for all hosts with included blocks 30564 1726882838.60200: done extending task lists 
30564 1726882838.60202: done processing included files 30564 1726882838.60202: results queue empty 30564 1726882838.60203: checking for any_errors_fatal 30564 1726882838.60207: done checking for any_errors_fatal 30564 1726882838.60208: checking for max_fail_percentage 30564 1726882838.60209: done checking for max_fail_percentage 30564 1726882838.60209: checking to see if all hosts have failed and the running result is not ok 30564 1726882838.60210: done checking to see if all hosts have failed 30564 1726882838.60211: getting the remaining hosts for this loop 30564 1726882838.60212: done getting the remaining hosts for this loop 30564 1726882838.60215: getting the next task for host managed_node2 30564 1726882838.60219: done getting next task for host managed_node2 30564 1726882838.60221: ^ task is: TASK: Include network role 30564 1726882838.60224: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882838.60226: getting variables 30564 1726882838.60227: in VariableManager get_vars() 30564 1726882838.60237: Calling all_inventory to load vars for managed_node2 30564 1726882838.60239: Calling groups_inventory to load vars for managed_node2 30564 1726882838.60241: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.60246: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.60249: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.60251: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.62067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.65996: done with get_vars() 30564 1726882838.66134: done getting variables

TASK [Include network role] ****************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3
Friday 20 September 2024 21:40:38 -0400 (0:00:00.213) 0:00:37.243 ******
30564 1726882838.66230: entering _queue_task() for managed_node2/include_role 30564 1726882838.66972: worker is 1 (out of 1 available) 30564 1726882838.66984: exiting _queue_task() for managed_node2/include_role 30564 1726882838.67036: done queuing things up, now waiting for results queue to drain 30564 1726882838.67038: waiting for pending results... 
30564 1726882838.67616: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882838.67737: in run() - task 0e448fcc-3ce9-4216-acec-000000000caa 30564 1726882838.67747: variable 'ansible_search_path' from source: unknown 30564 1726882838.67753: variable 'ansible_search_path' from source: unknown 30564 1726882838.67796: calling self._execute() 30564 1726882838.67905: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882838.67911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882838.67921: variable 'omit' from source: magic vars 30564 1726882838.68338: variable 'ansible_distribution_major_version' from source: facts 30564 1726882838.68354: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882838.68357: _execute() done 30564 1726882838.68360: dumping result to json 30564 1726882838.68362: done dumping result, returning 30564 1726882838.68370: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-000000000caa] 30564 1726882838.68380: sending task result for task 0e448fcc-3ce9-4216-acec-000000000caa 30564 1726882838.68490: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000caa 30564 1726882838.68493: WORKER PROCESS EXITING 30564 1726882838.68522: no more pending results, returning what we have 30564 1726882838.68528: in VariableManager get_vars() 30564 1726882838.68562: Calling all_inventory to load vars for managed_node2 30564 1726882838.68567: Calling groups_inventory to load vars for managed_node2 30564 1726882838.68571: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.68584: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.68587: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.68590: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.71207: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.73795: done with get_vars() 30564 1726882838.73815: variable 'ansible_search_path' from source: unknown 30564 1726882838.73816: variable 'ansible_search_path' from source: unknown 30564 1726882838.73967: variable 'omit' from source: magic vars 30564 1726882838.74009: variable 'omit' from source: magic vars 30564 1726882838.74024: variable 'omit' from source: magic vars 30564 1726882838.74027: we have included files to process 30564 1726882838.74028: generating all_blocks data 30564 1726882838.74030: done generating all_blocks data 30564 1726882838.74031: processing included file: fedora.linux_system_roles.network 30564 1726882838.74060: in VariableManager get_vars() 30564 1726882838.74076: done with get_vars() 30564 1726882838.74159: in VariableManager get_vars() 30564 1726882838.74180: done with get_vars() 30564 1726882838.74219: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882838.74351: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882838.74437: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882838.75096: in VariableManager get_vars() 30564 1726882838.75116: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882838.78177: iterating over new_blocks loaded from include file 30564 1726882838.78180: in VariableManager get_vars() 30564 1726882838.78196: done with get_vars() 30564 1726882838.78198: filtering new block on tags 30564 1726882838.78533: done filtering new block on tags 30564 1726882838.78536: in VariableManager get_vars() 30564 1726882838.78553: done with get_vars() 30564 1726882838.78555: filtering new block on tags 30564 1726882838.78572: done 
filtering new block on tags 30564 1726882838.78574: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882838.78580: extending task lists for all hosts with included blocks 30564 1726882838.78694: done extending task lists 30564 1726882838.78695: done processing included files 30564 1726882838.78696: results queue empty 30564 1726882838.78697: checking for any_errors_fatal 30564 1726882838.78700: done checking for any_errors_fatal 30564 1726882838.78701: checking for max_fail_percentage 30564 1726882838.78702: done checking for max_fail_percentage 30564 1726882838.78702: checking to see if all hosts have failed and the running result is not ok 30564 1726882838.78703: done checking to see if all hosts have failed 30564 1726882838.78704: getting the remaining hosts for this loop 30564 1726882838.78705: done getting the remaining hosts for this loop 30564 1726882838.78708: getting the next task for host managed_node2 30564 1726882838.78712: done getting next task for host managed_node2 30564 1726882838.78715: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882838.78718: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882838.78727: getting variables 30564 1726882838.78728: in VariableManager get_vars() 30564 1726882838.78740: Calling all_inventory to load vars for managed_node2 30564 1726882838.78742: Calling groups_inventory to load vars for managed_node2 30564 1726882838.78744: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.78749: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.78751: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.78754: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.80042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.81761: done with get_vars() 30564 1726882838.81784: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:40:38 -0400 (0:00:00.156) 0:00:37.399 ******
30564 1726882838.81858: entering _queue_task() for managed_node2/include_tasks 30564 1726882838.82187: worker is 1 (out of 1 available) 30564 1726882838.82199: exiting _queue_task() for managed_node2/include_tasks 30564 1726882838.82212: done queuing things up, now waiting for results queue to drain 30564 1726882838.82213: waiting for pending results... 
30564 1726882838.82513: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882838.82659: in run() - task 0e448fcc-3ce9-4216-acec-000000000d16 30564 1726882838.82686: variable 'ansible_search_path' from source: unknown 30564 1726882838.82694: variable 'ansible_search_path' from source: unknown 30564 1726882838.82733: calling self._execute() 30564 1726882838.82843: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882838.82857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882838.82877: variable 'omit' from source: magic vars 30564 1726882838.83273: variable 'ansible_distribution_major_version' from source: facts 30564 1726882838.83293: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882838.83308: _execute() done 30564 1726882838.83321: dumping result to json 30564 1726882838.83328: done dumping result, returning 30564 1726882838.83338: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-000000000d16] 30564 1726882838.83348: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d16 30564 1726882838.83505: no more pending results, returning what we have 30564 1726882838.83510: in VariableManager get_vars() 30564 1726882838.83554: Calling all_inventory to load vars for managed_node2 30564 1726882838.83557: Calling groups_inventory to load vars for managed_node2 30564 1726882838.83560: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.83577: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.83581: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.83584: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.84604: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d16 30564 
1726882838.84607: WORKER PROCESS EXITING 30564 1726882838.85469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.87259: done with get_vars() 30564 1726882838.87285: variable 'ansible_search_path' from source: unknown 30564 1726882838.87286: variable 'ansible_search_path' from source: unknown 30564 1726882838.87323: we have included files to process 30564 1726882838.87324: generating all_blocks data 30564 1726882838.87325: done generating all_blocks data 30564 1726882838.87328: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882838.87329: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882838.87331: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882838.87924: done processing included file 30564 1726882838.87926: iterating over new_blocks loaded from include file 30564 1726882838.87928: in VariableManager get_vars() 30564 1726882838.87952: done with get_vars() 30564 1726882838.87955: filtering new block on tags 30564 1726882838.87988: done filtering new block on tags 30564 1726882838.87992: in VariableManager get_vars() 30564 1726882838.88021: done with get_vars() 30564 1726882838.88023: filtering new block on tags 30564 1726882838.88070: done filtering new block on tags 30564 1726882838.88073: in VariableManager get_vars() 30564 1726882838.88094: done with get_vars() 30564 1726882838.88096: filtering new block on tags 30564 1726882838.88144: done filtering new block on tags 30564 1726882838.88146: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882838.88152: extending task lists for all hosts 
with included blocks 30564 1726882838.89994: done extending task lists 30564 1726882838.89995: done processing included files 30564 1726882838.89996: results queue empty 30564 1726882838.89996: checking for any_errors_fatal 30564 1726882838.90000: done checking for any_errors_fatal 30564 1726882838.90001: checking for max_fail_percentage 30564 1726882838.90002: done checking for max_fail_percentage 30564 1726882838.90002: checking to see if all hosts have failed and the running result is not ok 30564 1726882838.90003: done checking to see if all hosts have failed 30564 1726882838.90004: getting the remaining hosts for this loop 30564 1726882838.90005: done getting the remaining hosts for this loop 30564 1726882838.90008: getting the next task for host managed_node2 30564 1726882838.90012: done getting next task for host managed_node2 30564 1726882838.90014: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882838.90018: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882838.90027: getting variables 30564 1726882838.90029: in VariableManager get_vars() 30564 1726882838.90041: Calling all_inventory to load vars for managed_node2 30564 1726882838.90044: Calling groups_inventory to load vars for managed_node2 30564 1726882838.90046: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882838.90051: Calling all_plugins_play to load vars for managed_node2 30564 1726882838.90054: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882838.90057: Calling groups_plugins_play to load vars for managed_node2 30564 1726882838.91326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882838.93177: done with get_vars() 30564 1726882838.93198: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:40:38 -0400 (0:00:00.114) 0:00:37.514 ******
30564 1726882838.93282: entering _queue_task() for managed_node2/setup 30564 1726882838.93625: worker is 1 (out of 1 available) 30564 1726882838.93638: exiting _queue_task() for managed_node2/setup 30564 1726882838.93656: done queuing things up, now waiting for results queue to drain 30564 1726882838.93658: waiting for pending results... 
30564 1726882838.94199: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882838.94360: in run() - task 0e448fcc-3ce9-4216-acec-000000000d6d 30564 1726882838.94389: variable 'ansible_search_path' from source: unknown 30564 1726882838.94398: variable 'ansible_search_path' from source: unknown 30564 1726882838.94442: calling self._execute() 30564 1726882838.94548: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882838.94561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882838.94581: variable 'omit' from source: magic vars 30564 1726882838.95932: variable 'ansible_distribution_major_version' from source: facts 30564 1726882838.95950: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882838.96336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882839.01994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882839.02236: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882839.02431: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882839.02469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882839.02554: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882839.02652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882839.02691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882839.02744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882839.02789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882839.02806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882839.02872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882839.02900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882839.02927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882839.02980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882839.02997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882839.03181: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882839.03194: variable 'ansible_facts' from source: unknown 30564 1726882839.04018: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882839.04030: when evaluation is False, skipping this task 30564 1726882839.04042: _execute() done 30564 1726882839.04049: dumping result to json 30564 1726882839.04057: done dumping result, returning 30564 1726882839.04071: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000000d6d] 30564 1726882839.04082: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d6d skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882839.04232: no more pending results, returning what we have 30564 1726882839.04237: results queue empty 30564 1726882839.04238: checking for any_errors_fatal 30564 1726882839.04240: done checking for any_errors_fatal 30564 1726882839.04241: checking for max_fail_percentage 30564 1726882839.04243: done checking for max_fail_percentage 30564 1726882839.04244: checking to see if all hosts have failed and the running result is not ok 30564 1726882839.04244: done checking to see if all hosts have failed 30564 1726882839.04245: getting the remaining hosts for this loop 30564 1726882839.04247: done getting the remaining hosts for this loop 30564 1726882839.04252: getting the next task for host managed_node2 30564 1726882839.04270: done getting next task for host managed_node2 30564 1726882839.04275: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882839.04282: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882839.04304: getting variables 30564 1726882839.04306: in VariableManager get_vars() 30564 1726882839.04347: Calling all_inventory to load vars for managed_node2 30564 1726882839.04350: Calling groups_inventory to load vars for managed_node2 30564 1726882839.04352: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882839.04366: Calling all_plugins_play to load vars for managed_node2 30564 1726882839.04370: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882839.04373: Calling groups_plugins_play to load vars for managed_node2 30564 1726882839.06207: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d6d 30564 1726882839.06215: WORKER PROCESS EXITING 30564 1726882839.07638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882839.09624: done with get_vars() 30564 1726882839.09654: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:40:39 -0400 (0:00:00.164) 0:00:37.679 ****** 30564 1726882839.09773: entering _queue_task() for managed_node2/stat 30564 1726882839.10120: worker is 1 (out of 1 available) 30564 1726882839.10133: exiting _queue_task() for managed_node2/stat 30564 1726882839.10148: done queuing things up, now waiting for results queue to drain 30564 1726882839.10150: waiting for pending results... 
30564 1726882839.10478: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882839.10690: in run() - task 0e448fcc-3ce9-4216-acec-000000000d6f 30564 1726882839.10742: variable 'ansible_search_path' from source: unknown 30564 1726882839.10746: variable 'ansible_search_path' from source: unknown 30564 1726882839.10800: calling self._execute() 30564 1726882839.10929: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882839.10940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882839.10954: variable 'omit' from source: magic vars 30564 1726882839.11395: variable 'ansible_distribution_major_version' from source: facts 30564 1726882839.11409: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882839.11578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882839.11889: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882839.11945: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882839.11983: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882839.12030: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882839.12135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882839.12160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882839.12191: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882839.12221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882839.12321: variable '__network_is_ostree' from source: set_fact 30564 1726882839.12332: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882839.12340: when evaluation is False, skipping this task 30564 1726882839.12349: _execute() done 30564 1726882839.12352: dumping result to json 30564 1726882839.12355: done dumping result, returning 30564 1726882839.12360: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000000d6f] 30564 1726882839.12368: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d6f 30564 1726882839.12471: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d6f 30564 1726882839.12475: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882839.12528: no more pending results, returning what we have 30564 1726882839.12533: results queue empty 30564 1726882839.12534: checking for any_errors_fatal 30564 1726882839.12545: done checking for any_errors_fatal 30564 1726882839.12546: checking for max_fail_percentage 30564 1726882839.12549: done checking for max_fail_percentage 30564 1726882839.12549: checking to see if all hosts have failed and the running result is not ok 30564 1726882839.12550: done checking to see if all hosts have failed 30564 1726882839.12551: getting the remaining hosts for this loop 30564 1726882839.12553: done getting the remaining hosts for this loop 30564 
1726882839.12558: getting the next task for host managed_node2 30564 1726882839.12579: done getting next task for host managed_node2 30564 1726882839.12583: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882839.12590: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882839.12611: getting variables 30564 1726882839.12613: in VariableManager get_vars() 30564 1726882839.12651: Calling all_inventory to load vars for managed_node2 30564 1726882839.12654: Calling groups_inventory to load vars for managed_node2 30564 1726882839.12657: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882839.12674: Calling all_plugins_play to load vars for managed_node2 30564 1726882839.12679: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882839.12683: Calling groups_plugins_play to load vars for managed_node2 30564 1726882839.14712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882839.17693: done with get_vars() 30564 1726882839.17724: done getting variables 30564 1726882839.17884: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:40:39 -0400 (0:00:00.081) 0:00:37.760 ****** 30564 1726882839.17929: entering _queue_task() for managed_node2/set_fact 30564 1726882839.18302: worker is 1 (out of 1 available) 30564 1726882839.18319: exiting _queue_task() for managed_node2/set_fact 30564 1726882839.18334: done queuing things up, now waiting for results queue to drain 30564 1726882839.18335: waiting for pending results... 
30564 1726882839.18657: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882839.18813: in run() - task 0e448fcc-3ce9-4216-acec-000000000d70 30564 1726882839.18826: variable 'ansible_search_path' from source: unknown 30564 1726882839.18829: variable 'ansible_search_path' from source: unknown 30564 1726882839.18871: calling self._execute() 30564 1726882839.18993: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882839.19004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882839.19016: variable 'omit' from source: magic vars 30564 1726882839.19447: variable 'ansible_distribution_major_version' from source: facts 30564 1726882839.19460: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882839.19660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882839.19959: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882839.20012: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882839.20044: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882839.20096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882839.20187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882839.20217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882839.20245: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882839.20274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882839.20370: variable '__network_is_ostree' from source: set_fact 30564 1726882839.20384: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882839.20387: when evaluation is False, skipping this task 30564 1726882839.20390: _execute() done 30564 1726882839.20392: dumping result to json 30564 1726882839.20400: done dumping result, returning 30564 1726882839.20413: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000000d70] 30564 1726882839.20416: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d70 30564 1726882839.20507: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d70 30564 1726882839.20510: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882839.20581: no more pending results, returning what we have 30564 1726882839.20586: results queue empty 30564 1726882839.20587: checking for any_errors_fatal 30564 1726882839.20596: done checking for any_errors_fatal 30564 1726882839.20597: checking for max_fail_percentage 30564 1726882839.20600: done checking for max_fail_percentage 30564 1726882839.20600: checking to see if all hosts have failed and the running result is not ok 30564 1726882839.20601: done checking to see if all hosts have failed 30564 1726882839.20602: getting the remaining hosts for this loop 30564 1726882839.20604: done getting the remaining hosts for this loop 
30564 1726882839.20610: getting the next task for host managed_node2 30564 1726882839.20623: done getting next task for host managed_node2 30564 1726882839.20626: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882839.20633: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882839.20662: getting variables 30564 1726882839.20666: in VariableManager get_vars() 30564 1726882839.20707: Calling all_inventory to load vars for managed_node2 30564 1726882839.20710: Calling groups_inventory to load vars for managed_node2 30564 1726882839.20713: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882839.20724: Calling all_plugins_play to load vars for managed_node2 30564 1726882839.20727: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882839.20731: Calling groups_plugins_play to load vars for managed_node2 30564 1726882839.23104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882839.24875: done with get_vars() 30564 1726882839.24891: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:40:39 -0400 (0:00:00.070) 0:00:37.830 ****** 30564 1726882839.24958: entering _queue_task() for managed_node2/service_facts 30564 1726882839.25219: worker is 1 (out of 1 available) 30564 1726882839.25232: exiting _queue_task() for managed_node2/service_facts 30564 1726882839.25246: done queuing things up, now waiting for results queue to drain 30564 1726882839.25248: waiting for pending results... 
30564 1726882839.25559: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882839.25729: in run() - task 0e448fcc-3ce9-4216-acec-000000000d72 30564 1726882839.25748: variable 'ansible_search_path' from source: unknown 30564 1726882839.25755: variable 'ansible_search_path' from source: unknown 30564 1726882839.25795: calling self._execute() 30564 1726882839.25897: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882839.25915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882839.25932: variable 'omit' from source: magic vars 30564 1726882839.26322: variable 'ansible_distribution_major_version' from source: facts 30564 1726882839.26342: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882839.26358: variable 'omit' from source: magic vars 30564 1726882839.26437: variable 'omit' from source: magic vars 30564 1726882839.26461: variable 'omit' from source: magic vars 30564 1726882839.26496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882839.26522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882839.26537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882839.26551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882839.26561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882839.26601: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882839.26604: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882839.26607: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882839.26670: Set connection var ansible_timeout to 10 30564 1726882839.26673: Set connection var ansible_pipelining to False 30564 1726882839.26679: Set connection var ansible_shell_type to sh 30564 1726882839.26681: Set connection var ansible_shell_executable to /bin/sh 30564 1726882839.26694: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882839.26696: Set connection var ansible_connection to ssh 30564 1726882839.26712: variable 'ansible_shell_executable' from source: unknown 30564 1726882839.26715: variable 'ansible_connection' from source: unknown 30564 1726882839.26718: variable 'ansible_module_compression' from source: unknown 30564 1726882839.26720: variable 'ansible_shell_type' from source: unknown 30564 1726882839.26722: variable 'ansible_shell_executable' from source: unknown 30564 1726882839.26724: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882839.26726: variable 'ansible_pipelining' from source: unknown 30564 1726882839.26730: variable 'ansible_timeout' from source: unknown 30564 1726882839.26734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882839.26880: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882839.26890: variable 'omit' from source: magic vars 30564 1726882839.26895: starting attempt loop 30564 1726882839.26899: running the handler 30564 1726882839.26914: _low_level_execute_command(): starting 30564 1726882839.26917: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882839.27434: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882839.27442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882839.27479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.27493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882839.27503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.27622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882839.27713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882839.29378: stdout chunk (state=3): >>>/root <<< 30564 1726882839.29537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882839.29541: stderr chunk (state=3): >>><<< 30564 1726882839.29545: stdout chunk (state=3): >>><<< 30564 1726882839.29569: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882839.29586: _low_level_execute_command(): starting 30564 1726882839.29592: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921 `" && echo ansible-tmp-1726882839.295715-32304-220820449135921="` echo /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921 `" ) && sleep 0' 30564 1726882839.30147: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882839.30156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882839.30161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882839.30179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882839.30208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882839.30214: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882839.30224: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.30235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882839.30245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882839.30252: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882839.30261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882839.30268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.30325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882839.30333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882839.30347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882839.30461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882839.32330: stdout chunk (state=3): >>>ansible-tmp-1726882839.295715-32304-220820449135921=/root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921 <<< 30564 1726882839.32444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882839.32529: stderr chunk (state=3): >>><<< 30564 1726882839.32539: stdout chunk (state=3): >>><<< 30564 1726882839.32776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882839.295715-32304-220820449135921=/root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882839.32780: variable 'ansible_module_compression' from source: unknown 30564 1726882839.32782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882839.32784: variable 'ansible_facts' from source: unknown 30564 1726882839.32870: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921/AnsiballZ_service_facts.py 30564 1726882839.32949: Sending initial data 30564 1726882839.32952: Sent initial data (161 bytes) 30564 1726882839.33924: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882839.33927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882839.33930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882839.33932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 
1726882839.33950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882839.33955: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.33971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882839.33979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882839.33985: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882839.33996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882839.34004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882839.34024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882839.34027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.34086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882839.34094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882839.34106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882839.34228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882839.35947: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 
debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882839.36040: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882839.36139: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmppulni_z9 /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921/AnsiballZ_service_facts.py <<< 30564 1726882839.36235: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882839.37487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882839.37610: stderr chunk (state=3): >>><<< 30564 1726882839.37613: stdout chunk (state=3): >>><<< 30564 1726882839.37616: done transferring module to remote 30564 1726882839.37618: _low_level_execute_command(): starting 30564 1726882839.37620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921/ /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921/AnsiballZ_service_facts.py && sleep 0' 30564 1726882839.38048: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882839.38051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882839.38095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.38099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882839.38105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.38139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882839.38143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882839.38254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882839.40018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882839.40107: stderr chunk (state=3): >>><<< 30564 1726882839.40110: stdout chunk (state=3): >>><<< 30564 1726882839.40155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882839.40158: _low_level_execute_command(): starting 30564 1726882839.40186: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921/AnsiballZ_service_facts.py && sleep 0' 30564 1726882839.40798: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882839.40801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882839.40837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.40841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882839.40843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882839.40892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882839.40895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882839.41013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882840.74995: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 30564 1726882840.75026: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 30564 1726882840.75046: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, 
"quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882840.76210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882840.76269: stderr chunk (state=3): >>><<< 30564 1726882840.76272: stdout chunk (state=3): >>><<< 30564 1726882840.76295: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": 
"rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": 
{"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": 
"systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
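The stdout captured above is the `service_facts` payload: a JSON object mapping unit names to small dicts with `state`, `status`, and `source` keys. When inspecting a trace like this offline, it can be handy to slice that payload directly; a minimal sketch (the sample entries are copied from this run's output, and the `running_units` helper name is illustrative, not part of Ansible):

```python
# Filter a captured service_facts payload for units whose state is "running".
# Sample entries copied verbatim from the module output logged above.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
}

def running_units(services: dict) -> list[str]:
    """Return the sorted names of units whose state is 'running'."""
    return sorted(name for name, info in services.items()
                  if info["state"] == "running")

print(running_units(services))
# -> ['NetworkManager.service', 'sshd.service', 'wpa_supplicant.service']
```

The same keys are what a playbook would reach via `ansible_facts.services['NetworkManager.service'].state` once the fact-gathering task has run (censored in this trace because the task set `no_log: true`).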
30564 1726882840.76989: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882840.76996: _low_level_execute_command(): starting 30564 1726882840.77002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882839.295715-32304-220820449135921/ > /dev/null 2>&1 && sleep 0' 30564 1726882840.77449: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882840.77453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882840.77490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.77502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.77548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882840.77565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882840.77675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882840.79504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882840.79535: stderr chunk (state=3): >>><<< 30564 1726882840.79538: stdout chunk (state=3): >>><<< 30564 1726882840.79550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882840.79556: handler run complete 30564 1726882840.79669: variable 'ansible_facts' 
from source: unknown 30564 1726882840.79775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882840.80228: variable 'ansible_facts' from source: unknown 30564 1726882840.80350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882840.80532: attempt loop complete, returning result 30564 1726882840.80535: _execute() done 30564 1726882840.80538: dumping result to json 30564 1726882840.80597: done dumping result, returning 30564 1726882840.80606: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000000d72] 30564 1726882840.80613: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d72 30564 1726882840.81725: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d72 30564 1726882840.81729: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882840.81854: no more pending results, returning what we have 30564 1726882840.81857: results queue empty 30564 1726882840.81859: checking for any_errors_fatal 30564 1726882840.81862: done checking for any_errors_fatal 30564 1726882840.81866: checking for max_fail_percentage 30564 1726882840.81870: done checking for max_fail_percentage 30564 1726882840.81871: checking to see if all hosts have failed and the running result is not ok 30564 1726882840.81872: done checking to see if all hosts have failed 30564 1726882840.81873: getting the remaining hosts for this loop 30564 1726882840.81874: done getting the remaining hosts for this loop 30564 1726882840.81880: getting the next task for host managed_node2 30564 1726882840.81886: done getting next task for host managed_node2 30564 1726882840.81890: ^ task is: TASK: fedora.linux_system_roles.network : 
Check which packages are installed 30564 1726882840.81897: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882840.81907: getting variables 30564 1726882840.81909: in VariableManager get_vars() 30564 1726882840.81936: Calling all_inventory to load vars for managed_node2 30564 1726882840.81939: Calling groups_inventory to load vars for managed_node2 30564 1726882840.81941: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882840.81950: Calling all_plugins_play to load vars for managed_node2 30564 1726882840.81953: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882840.81956: Calling groups_plugins_play to load vars for managed_node2 30564 1726882840.83330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882840.85480: done with get_vars() 30564 1726882840.85496: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:40:40 -0400 (0:00:01.606) 0:00:39.436 ****** 30564 1726882840.85562: entering _queue_task() for managed_node2/package_facts 30564 1726882840.85798: worker is 1 (out of 1 available) 30564 1726882840.85809: exiting _queue_task() for managed_node2/package_facts 30564 1726882840.85823: done queuing things up, now waiting for results queue to drain 30564 1726882840.85824: waiting for pending results... 
30564 1726882840.86010: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882840.86119: in run() - task 0e448fcc-3ce9-4216-acec-000000000d73 30564 1726882840.86136: variable 'ansible_search_path' from source: unknown 30564 1726882840.86140: variable 'ansible_search_path' from source: unknown 30564 1726882840.86165: calling self._execute() 30564 1726882840.86239: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882840.86243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882840.86251: variable 'omit' from source: magic vars 30564 1726882840.86524: variable 'ansible_distribution_major_version' from source: facts 30564 1726882840.86534: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882840.86540: variable 'omit' from source: magic vars 30564 1726882840.86594: variable 'omit' from source: magic vars 30564 1726882840.86614: variable 'omit' from source: magic vars 30564 1726882840.86645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882840.86673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882840.86691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882840.86706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882840.86716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882840.86738: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882840.86741: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882840.86744: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882840.86817: Set connection var ansible_timeout to 10 30564 1726882840.86821: Set connection var ansible_pipelining to False 30564 1726882840.86823: Set connection var ansible_shell_type to sh 30564 1726882840.86829: Set connection var ansible_shell_executable to /bin/sh 30564 1726882840.86836: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882840.86838: Set connection var ansible_connection to ssh 30564 1726882840.86857: variable 'ansible_shell_executable' from source: unknown 30564 1726882840.86860: variable 'ansible_connection' from source: unknown 30564 1726882840.86863: variable 'ansible_module_compression' from source: unknown 30564 1726882840.86867: variable 'ansible_shell_type' from source: unknown 30564 1726882840.86872: variable 'ansible_shell_executable' from source: unknown 30564 1726882840.86874: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882840.86876: variable 'ansible_pipelining' from source: unknown 30564 1726882840.86878: variable 'ansible_timeout' from source: unknown 30564 1726882840.86882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882840.87027: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882840.87031: variable 'omit' from source: magic vars 30564 1726882840.87038: starting attempt loop 30564 1726882840.87041: running the handler 30564 1726882840.87052: _low_level_execute_command(): starting 30564 1726882840.87059: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882840.87549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882840.87566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882840.87591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882840.87607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.87650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882840.87662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882840.87787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882840.89437: stdout chunk (state=3): >>>/root <<< 30564 1726882840.89540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882840.89586: stderr chunk (state=3): >>><<< 30564 1726882840.89589: stdout chunk (state=3): >>><<< 30564 1726882840.89611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882840.89625: _low_level_execute_command(): starting 30564 1726882840.89630: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677 `" && echo ansible-tmp-1726882840.896125-32372-141862981354677="` echo /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677 `" ) && sleep 0' 30564 1726882840.90051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882840.90079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882840.90087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 30564 1726882840.90105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.90170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882840.90174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882840.90268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882840.92126: stdout chunk (state=3): >>>ansible-tmp-1726882840.896125-32372-141862981354677=/root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677 <<< 30564 1726882840.92235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882840.92280: stderr chunk (state=3): >>><<< 30564 1726882840.92285: stdout chunk (state=3): >>><<< 30564 1726882840.92300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882840.896125-32372-141862981354677=/root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882840.92333: variable 'ansible_module_compression' from source: unknown 30564 1726882840.92370: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882840.92414: variable 'ansible_facts' from source: unknown 30564 1726882840.92541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677/AnsiballZ_package_facts.py 30564 1726882840.92651: Sending initial data 30564 1726882840.92654: Sent initial data (161 bytes) 30564 1726882840.93307: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882840.93313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882840.93340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.93352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.93409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882840.93421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882840.93526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882840.95362: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882840.95369: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882840.95457: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882840.95554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpg7ekqzf4 /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677/AnsiballZ_package_facts.py <<< 30564 1726882840.95652: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882840.97643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882840.97733: stderr chunk (state=3): >>><<< 30564 1726882840.97736: stdout chunk (state=3): >>><<< 30564 1726882840.97750: done transferring module to remote 30564 1726882840.97758: 
_low_level_execute_command(): starting 30564 1726882840.97762: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677/ /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677/AnsiballZ_package_facts.py && sleep 0' 30564 1726882840.98187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882840.98193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882840.98221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.98232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882840.98290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882840.98302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882840.98404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882841.00165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882841.00211: stderr chunk (state=3): >>><<< 30564 1726882841.00214: stdout chunk (state=3): >>><<< 
30564 1726882841.00224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882841.00232: _low_level_execute_command(): starting 30564 1726882841.00234: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677/AnsiballZ_package_facts.py && sleep 0' 30564 1726882841.00636: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882841.00643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882841.00674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882841.00687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882841.00744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882841.00753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882841.00869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882841.47339: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": 
"python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", 
"version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", 
"version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": 
"4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", 
"release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "<<< 30564 1726882841.47503: stdout chunk (state=3): >>>rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": 
"10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": 
[{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": 
"python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": 
"libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": 
"1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": 
[{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": 
"5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", <<< 30564 1726882841.47578: stdout chunk (state=3): >>>"release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": 
"perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", 
"version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": 
"systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", 
"release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", 
"release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882841.48974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882841.49050: stderr chunk (state=3): >>><<< 30564 1726882841.49053: stdout chunk (state=3): >>><<< 30564 1726882841.49280: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882841.51491: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882841.51516: _low_level_execute_command(): starting 30564 1726882841.51526: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882840.896125-32372-141862981354677/ > /dev/null 2>&1 && sleep 0' 30564 1726882841.52145: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882841.52160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882841.52182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882841.52201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882841.52244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882841.52257: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882841.52278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882841.52297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882841.52310: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882841.52322: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882841.52335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882841.52349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882841.52370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882841.52384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882841.52396: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882841.52410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882841.52489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882841.52506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882841.52520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882841.52670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882841.54517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882841.54582: stderr chunk (state=3): >>><<< 30564 1726882841.54585: stdout chunk (state=3): >>><<< 30564 1726882841.55272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882841.55276: handler run complete 30564 1726882841.55576: variable 'ansible_facts' from source: unknown 30564 1726882841.56089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882841.58239: variable 'ansible_facts' from source: unknown 30564 1726882841.58728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882841.59530: attempt loop complete, returning result 30564 1726882841.59547: _execute() done 30564 1726882841.59553: dumping result to json 30564 1726882841.59775: done dumping result, returning 30564 1726882841.59787: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000000d73] 30564 1726882841.59797: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d73 30564 1726882841.61862: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d73 30564 1726882841.61871: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882841.62026: no more pending results, returning what we have 30564 1726882841.62029: results queue empty 30564 1726882841.62030: checking for 
any_errors_fatal 30564 1726882841.62038: done checking for any_errors_fatal 30564 1726882841.62039: checking for max_fail_percentage 30564 1726882841.62041: done checking for max_fail_percentage 30564 1726882841.62041: checking to see if all hosts have failed and the running result is not ok 30564 1726882841.62042: done checking to see if all hosts have failed 30564 1726882841.62043: getting the remaining hosts for this loop 30564 1726882841.62045: done getting the remaining hosts for this loop 30564 1726882841.62048: getting the next task for host managed_node2 30564 1726882841.62056: done getting next task for host managed_node2 30564 1726882841.62060: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882841.62072: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882841.62083: getting variables 30564 1726882841.62085: in VariableManager get_vars() 30564 1726882841.62112: Calling all_inventory to load vars for managed_node2 30564 1726882841.62114: Calling groups_inventory to load vars for managed_node2 30564 1726882841.62120: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882841.62130: Calling all_plugins_play to load vars for managed_node2 30564 1726882841.62133: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882841.62136: Calling groups_plugins_play to load vars for managed_node2 30564 1726882841.63470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882841.64498: done with get_vars() 30564 1726882841.64515: done getting variables 30564 1726882841.64558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:40:41 -0400 (0:00:00.790) 0:00:40.227 ****** 30564 1726882841.64587: entering _queue_task() for managed_node2/debug 30564 1726882841.64807: worker is 1 (out of 1 available) 30564 1726882841.64822: exiting _queue_task() for managed_node2/debug 30564 1726882841.64835: done queuing things up, now waiting for results queue to drain 30564 1726882841.64839: waiting for pending results... 
30564 1726882841.65037: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882841.65138: in run() - task 0e448fcc-3ce9-4216-acec-000000000d17 30564 1726882841.65150: variable 'ansible_search_path' from source: unknown 30564 1726882841.65158: variable 'ansible_search_path' from source: unknown 30564 1726882841.65241: calling self._execute() 30564 1726882841.65417: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882841.65423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882841.65427: variable 'omit' from source: magic vars 30564 1726882841.66256: variable 'ansible_distribution_major_version' from source: facts 30564 1726882841.66266: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882841.66271: variable 'omit' from source: magic vars 30564 1726882841.66274: variable 'omit' from source: magic vars 30564 1726882841.66276: variable 'network_provider' from source: set_fact 30564 1726882841.66280: variable 'omit' from source: magic vars 30564 1726882841.66282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882841.66285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882841.66287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882841.66290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882841.66292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882841.66294: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882841.66296: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882841.66299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882841.66301: Set connection var ansible_timeout to 10 30564 1726882841.66303: Set connection var ansible_pipelining to False 30564 1726882841.66305: Set connection var ansible_shell_type to sh 30564 1726882841.66307: Set connection var ansible_shell_executable to /bin/sh 30564 1726882841.66309: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882841.66311: Set connection var ansible_connection to ssh 30564 1726882841.66313: variable 'ansible_shell_executable' from source: unknown 30564 1726882841.66315: variable 'ansible_connection' from source: unknown 30564 1726882841.66317: variable 'ansible_module_compression' from source: unknown 30564 1726882841.66320: variable 'ansible_shell_type' from source: unknown 30564 1726882841.66321: variable 'ansible_shell_executable' from source: unknown 30564 1726882841.66324: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882841.66326: variable 'ansible_pipelining' from source: unknown 30564 1726882841.66328: variable 'ansible_timeout' from source: unknown 30564 1726882841.66330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882841.66333: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882841.66336: variable 'omit' from source: magic vars 30564 1726882841.66338: starting attempt loop 30564 1726882841.66340: running the handler 30564 1726882841.66689: handler run complete 30564 1726882841.66692: attempt loop complete, returning result 30564 1726882841.66695: _execute() done 30564 1726882841.66697: dumping result to json 30564 1726882841.66699: done dumping result, returning 
30564 1726882841.66700: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000000d17] 30564 1726882841.66702: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d17 30564 1726882841.66767: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d17 30564 1726882841.66770: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882841.66840: no more pending results, returning what we have 30564 1726882841.66843: results queue empty 30564 1726882841.66844: checking for any_errors_fatal 30564 1726882841.66854: done checking for any_errors_fatal 30564 1726882841.66855: checking for max_fail_percentage 30564 1726882841.66857: done checking for max_fail_percentage 30564 1726882841.66857: checking to see if all hosts have failed and the running result is not ok 30564 1726882841.66858: done checking to see if all hosts have failed 30564 1726882841.66859: getting the remaining hosts for this loop 30564 1726882841.66860: done getting the remaining hosts for this loop 30564 1726882841.66865: getting the next task for host managed_node2 30564 1726882841.66872: done getting next task for host managed_node2 30564 1726882841.66876: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882841.66880: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882841.66900: getting variables 30564 1726882841.66902: in VariableManager get_vars() 30564 1726882841.66935: Calling all_inventory to load vars for managed_node2 30564 1726882841.66937: Calling groups_inventory to load vars for managed_node2 30564 1726882841.66939: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882841.66946: Calling all_plugins_play to load vars for managed_node2 30564 1726882841.66947: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882841.66949: Calling groups_plugins_play to load vars for managed_node2 30564 1726882841.68295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882841.69385: done with get_vars() 30564 1726882841.69400: done getting variables 30564 1726882841.69442: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:40:41 -0400 (0:00:00.048) 0:00:40.276 ****** 30564 1726882841.69472: entering _queue_task() for managed_node2/fail 30564 1726882841.69672: worker is 1 (out of 1 available) 30564 1726882841.69685: exiting _queue_task() for managed_node2/fail 30564 1726882841.69697: done queuing things up, now waiting for results queue to drain 30564 1726882841.69699: waiting for pending results... 30564 1726882841.69894: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882841.69992: in run() - task 0e448fcc-3ce9-4216-acec-000000000d18 30564 1726882841.70004: variable 'ansible_search_path' from source: unknown 30564 1726882841.70008: variable 'ansible_search_path' from source: unknown 30564 1726882841.70036: calling self._execute() 30564 1726882841.70111: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882841.70116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882841.70125: variable 'omit' from source: magic vars 30564 1726882841.70471: variable 'ansible_distribution_major_version' from source: facts 30564 1726882841.70498: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882841.70636: variable 'network_state' from source: role '' defaults 30564 1726882841.70654: Evaluated conditional (network_state != {}): False 30564 1726882841.70662: when evaluation is False, skipping this task 30564 1726882841.70674: _execute() done 30564 1726882841.70683: dumping result to json 30564 1726882841.70691: done dumping result, returning 30564 1726882841.70711: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-000000000d18] 30564 1726882841.70721: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d18 30564 1726882841.70837: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d18 30564 1726882841.70844: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882841.70899: no more pending results, returning what we have 30564 1726882841.70903: results queue empty 30564 1726882841.70904: checking for any_errors_fatal 30564 1726882841.70911: done checking for any_errors_fatal 30564 1726882841.70911: checking for max_fail_percentage 30564 1726882841.70913: done checking for max_fail_percentage 30564 1726882841.70914: checking to see if all hosts have failed and the running result is not ok 30564 1726882841.70915: done checking to see if all hosts have failed 30564 1726882841.70915: getting the remaining hosts for this loop 30564 1726882841.70917: done getting the remaining hosts for this loop 30564 1726882841.70921: getting the next task for host managed_node2 30564 1726882841.70929: done getting next task for host managed_node2 30564 1726882841.70942: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882841.70948: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882841.70971: getting variables 30564 1726882841.70974: in VariableManager get_vars() 30564 1726882841.71007: Calling all_inventory to load vars for managed_node2 30564 1726882841.71010: Calling groups_inventory to load vars for managed_node2 30564 1726882841.71012: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882841.71025: Calling all_plugins_play to load vars for managed_node2 30564 1726882841.71028: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882841.71030: Calling groups_plugins_play to load vars for managed_node2 30564 1726882841.75795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882841.76881: done with get_vars() 30564 1726882841.76904: done getting variables 30564 1726882841.76949: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:40:41 -0400 (0:00:00.075) 0:00:40.351 ****** 30564 1726882841.76979: entering _queue_task() for managed_node2/fail 30564 1726882841.77286: worker is 1 (out of 1 available) 30564 1726882841.77297: exiting _queue_task() for managed_node2/fail 30564 1726882841.77308: done queuing things up, now waiting for results queue to drain 30564 1726882841.77309: waiting for pending results... 30564 1726882841.77769: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882841.77919: in run() - task 0e448fcc-3ce9-4216-acec-000000000d19 30564 1726882841.77937: variable 'ansible_search_path' from source: unknown 30564 1726882841.77942: variable 'ansible_search_path' from source: unknown 30564 1726882841.77984: calling self._execute() 30564 1726882841.78100: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882841.78105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882841.78116: variable 'omit' from source: magic vars 30564 1726882841.78525: variable 'ansible_distribution_major_version' from source: facts 30564 1726882841.78536: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882841.78668: variable 'network_state' from source: role '' defaults 30564 1726882841.78682: Evaluated conditional (network_state != {}): False 30564 1726882841.78686: when evaluation is False, skipping this task 30564 1726882841.78696: _execute() done 30564 1726882841.78699: dumping result to json 30564 1726882841.78702: done dumping result, returning 30564 1726882841.78706: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-000000000d19] 30564 1726882841.78713: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d19 30564 1726882841.78813: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d19 30564 1726882841.78816: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882841.78860: no more pending results, returning what we have 30564 1726882841.78866: results queue empty 30564 1726882841.78867: checking for any_errors_fatal 30564 1726882841.78877: done checking for any_errors_fatal 30564 1726882841.78878: checking for max_fail_percentage 30564 1726882841.78879: done checking for max_fail_percentage 30564 1726882841.78880: checking to see if all hosts have failed and the running result is not ok 30564 1726882841.78881: done checking to see if all hosts have failed 30564 1726882841.78882: getting the remaining hosts for this loop 30564 1726882841.78883: done getting the remaining hosts for this loop 30564 1726882841.78887: getting the next task for host managed_node2 30564 1726882841.78895: done getting next task for host managed_node2 30564 1726882841.78898: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882841.78904: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882841.78922: getting variables 30564 1726882841.78924: in VariableManager get_vars() 30564 1726882841.78954: Calling all_inventory to load vars for managed_node2 30564 1726882841.78956: Calling groups_inventory to load vars for managed_node2 30564 1726882841.78958: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882841.78969: Calling all_plugins_play to load vars for managed_node2 30564 1726882841.78972: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882841.78975: Calling groups_plugins_play to load vars for managed_node2 30564 1726882841.80345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882841.82111: done with get_vars() 30564 1726882841.82131: done getting variables 30564 1726882841.82193: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:40:41 -0400 (0:00:00.052) 0:00:40.403 ****** 30564 1726882841.82225: entering _queue_task() for managed_node2/fail 30564 1726882841.82506: worker is 1 (out of 1 available) 30564 1726882841.82519: exiting _queue_task() for managed_node2/fail 30564 1726882841.82532: done queuing things up, now waiting for results queue to drain 30564 1726882841.82533: waiting for pending results... 30564 1726882841.83081: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882841.83223: in run() - task 0e448fcc-3ce9-4216-acec-000000000d1a 30564 1726882841.83247: variable 'ansible_search_path' from source: unknown 30564 1726882841.83255: variable 'ansible_search_path' from source: unknown 30564 1726882841.83297: calling self._execute() 30564 1726882841.83400: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882841.83413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882841.83429: variable 'omit' from source: magic vars 30564 1726882841.83807: variable 'ansible_distribution_major_version' from source: facts 30564 1726882841.83825: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882841.84004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882841.87879: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882841.87976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882841.88182: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882841.88222: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882841.88257: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882841.88341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882841.88380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882841.88411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882841.88461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882841.88485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882841.88589: variable 'ansible_distribution_major_version' from source: facts
30564 1726882841.88610: Evaluated conditional (ansible_distribution_major_version | int > 9): False
30564 1726882841.88619: when evaluation is False, skipping this task
30564 1726882841.88627: _execute() done
30564 1726882841.88634: dumping result to json
30564 1726882841.88642: done dumping result, returning
30564 1726882841.88655: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-000000000d1a]
30564 1726882841.88670: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1a
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
30564 1726882841.88825: no more pending results, returning what we have
30564 1726882841.88829: results queue empty
30564 1726882841.88830: checking for any_errors_fatal
30564 1726882841.88837: done checking for any_errors_fatal
30564 1726882841.88838: checking for max_fail_percentage
30564 1726882841.88840: done checking for max_fail_percentage
30564 1726882841.88841: checking to see if all hosts have failed and the running result is not ok
30564 1726882841.88842: done checking to see if all hosts have failed
30564 1726882841.88843: getting the remaining hosts for this loop
30564 1726882841.88845: done getting the remaining hosts for this loop
30564 1726882841.88849: getting the next task for host managed_node2
30564 1726882841.88859: done getting next task for host managed_node2
30564 1726882841.88866: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882841.88873: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882841.88891: getting variables
30564 1726882841.88893: in VariableManager get_vars()
30564 1726882841.88941: Calling all_inventory to load vars for managed_node2
30564 1726882841.88944: Calling groups_inventory to load vars for managed_node2
30564 1726882841.88947: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882841.88958: Calling all_plugins_play to load vars for managed_node2
30564 1726882841.88961: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882841.88966: Calling groups_plugins_play to load vars for managed_node2
30564 1726882841.90141: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1a
30564 1726882841.90144: WORKER PROCESS EXITING
30564 1726882841.90881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882841.92866: done with get_vars()
30564 1726882841.92890: done getting variables
30564 1726882841.92943: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:40:41 -0400 (0:00:00.107) 0:00:40.511 ******
30564 1726882841.92979: entering _queue_task() for managed_node2/dnf
30564 1726882841.93204: worker is 1 (out of 1 available)
30564 1726882841.93218: exiting _queue_task() for managed_node2/dnf
30564 1726882841.93230: done queuing things up, now waiting for results queue to drain
30564 1726882841.93232: waiting for pending results...
30564 1726882841.93427: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882841.93530: in run() - task 0e448fcc-3ce9-4216-acec-000000000d1b
30564 1726882841.93540: variable 'ansible_search_path' from source: unknown
30564 1726882841.93544: variable 'ansible_search_path' from source: unknown
30564 1726882841.93574: calling self._execute()
30564 1726882841.93650: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882841.93654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882841.93663: variable 'omit' from source: magic vars
30564 1726882841.93936: variable 'ansible_distribution_major_version' from source: facts
30564 1726882841.93949: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882841.94088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882841.95854: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882841.95945: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882841.95980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882841.96022: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882841.96047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882841.96130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882841.96158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882841.96185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882841.96233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882841.96248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882841.96382: variable 'ansible_distribution' from source: facts
30564 1726882841.96389: variable 'ansible_distribution_major_version' from source: facts
30564 1726882841.96406: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30564 1726882841.96612: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882841.96758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882841.96782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882841.96819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882841.96845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882841.96858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882841.96891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882841.96907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882841.96923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882841.96948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882841.96958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882841.96993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882841.97009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882841.97028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882841.97052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882841.97062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882841.97177: variable 'network_connections' from source: include params
30564 1726882841.97189: variable 'interface' from source: play vars
30564 1726882841.97244: variable 'interface' from source: play vars
30564 1726882841.97292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882841.97413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882841.97440: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882841.97464: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882841.97487: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882841.97521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882841.97537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882841.97558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882841.97579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882841.97616: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882841.97770: variable 'network_connections' from source: include params
30564 1726882841.97774: variable 'interface' from source: play vars
30564 1726882841.97814: variable 'interface' from source: play vars
30564 1726882841.97832: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882841.97836: when evaluation is False, skipping this task
30564 1726882841.97840: _execute() done
30564 1726882841.97843: dumping result to json
30564 1726882841.97845: done dumping result, returning
30564 1726882841.97856: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000d1b]
30564 1726882841.97859: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1b
30564 1726882841.97947: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1b
30564 1726882841.97950: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882841.98007: no more pending results, returning what we have
30564 1726882841.98011: results queue empty
30564 1726882841.98012: checking for any_errors_fatal
30564 1726882841.98019: done checking for any_errors_fatal
30564 1726882841.98020: checking for max_fail_percentage
30564 1726882841.98022: done checking for max_fail_percentage
30564 1726882841.98022: checking to see if all hosts have failed and the running result is not ok
30564 1726882841.98023: done checking to see if all hosts have failed
30564 1726882841.98024: getting the remaining hosts for this loop
30564 1726882841.98026: done getting the remaining hosts for this loop
30564 1726882841.98029: getting the next task for host managed_node2
30564 1726882841.98036: done getting next task for host managed_node2
30564 1726882841.98040: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882841.98045: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882841.98063: getting variables
30564 1726882841.98065: in VariableManager get_vars()
30564 1726882841.98099: Calling all_inventory to load vars for managed_node2
30564 1726882841.98101: Calling groups_inventory to load vars for managed_node2
30564 1726882841.98103: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882841.98112: Calling all_plugins_play to load vars for managed_node2
30564 1726882841.98114: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882841.98117: Calling groups_plugins_play to load vars for managed_node2
30564 1726882841.99038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882842.00389: done with get_vars()
30564 1726882842.00411: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30564 1726882842.00487: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:40:42 -0400 (0:00:00.075) 0:00:40.586 ******
30564 1726882842.00517: entering _queue_task() for managed_node2/yum
30564 1726882842.00804: worker is 1 (out of 1 available)
30564 1726882842.00817: exiting _queue_task() for managed_node2/yum
30564 1726882842.00830: done queuing things up, now waiting for results queue to drain
30564 1726882842.00831: waiting for pending results...
30564 1726882842.01138: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882842.01309: in run() - task 0e448fcc-3ce9-4216-acec-000000000d1c
30564 1726882842.01367: variable 'ansible_search_path' from source: unknown
30564 1726882842.01387: variable 'ansible_search_path' from source: unknown
30564 1726882842.01431: calling self._execute()
30564 1726882842.01551: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882842.01555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882842.01566: variable 'omit' from source: magic vars
30564 1726882842.01854: variable 'ansible_distribution_major_version' from source: facts
30564 1726882842.01865: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882842.01995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882842.03575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882842.03627: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882842.03653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882842.03683: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882842.03703: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882842.03757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882842.03780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882842.03804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882842.03828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882842.03838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882842.03912: variable 'ansible_distribution_major_version' from source: facts
30564 1726882842.03920: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30564 1726882842.03923: when evaluation is False, skipping this task
30564 1726882842.03926: _execute() done
30564 1726882842.03928: dumping result to json
30564 1726882842.03930: done dumping result, returning
30564 1726882842.03937: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000d1c]
30564 1726882842.03942: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1c
30564 1726882842.04031: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1c
30564 1726882842.04034: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30564 1726882842.04124: no more pending results, returning what we have
30564 1726882842.04127: results queue empty
30564 1726882842.04128: checking for any_errors_fatal
30564 1726882842.04134: done checking for any_errors_fatal
30564 1726882842.04135: checking for max_fail_percentage
30564 1726882842.04136: done checking for max_fail_percentage
30564 1726882842.04137: checking to see if all hosts have failed and the running result is not ok
30564 1726882842.04138: done checking to see if all hosts have failed
30564 1726882842.04138: getting the remaining hosts for this loop
30564 1726882842.04140: done getting the remaining hosts for this loop
30564 1726882842.04145: getting the next task for host managed_node2
30564 1726882842.04151: done getting next task for host managed_node2
30564 1726882842.04155: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882842.04159: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882842.04219: getting variables
30564 1726882842.04220: in VariableManager get_vars()
30564 1726882842.04247: Calling all_inventory to load vars for managed_node2
30564 1726882842.04249: Calling groups_inventory to load vars for managed_node2
30564 1726882842.04251: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882842.04257: Calling all_plugins_play to load vars for managed_node2
30564 1726882842.04259: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882842.04260: Calling groups_plugins_play to load vars for managed_node2
30564 1726882842.05566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882842.07172: done with get_vars()
30564 1726882842.07187: done getting variables
30564 1726882842.07225: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:40:42 -0400 (0:00:00.067) 0:00:40.653 ******
30564 1726882842.07248: entering _queue_task() for managed_node2/fail
30564 1726882842.07446: worker is 1 (out of 1 available)
30564 1726882842.07458: exiting _queue_task() for managed_node2/fail
30564 1726882842.07471: done queuing things up, now waiting for results queue to drain
30564 1726882842.07473: waiting for pending results...
30564 1726882842.07653: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882842.07746: in run() - task 0e448fcc-3ce9-4216-acec-000000000d1d
30564 1726882842.07755: variable 'ansible_search_path' from source: unknown
30564 1726882842.07759: variable 'ansible_search_path' from source: unknown
30564 1726882842.07789: calling self._execute()
30564 1726882842.07869: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882842.07874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882842.07882: variable 'omit' from source: magic vars
30564 1726882842.08149: variable 'ansible_distribution_major_version' from source: facts
30564 1726882842.08159: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882842.08243: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882842.08375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882842.10688: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882842.10729: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882842.10774: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882842.10794: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882842.10813: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882842.10871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882842.10893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882842.10911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882842.10936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882842.10947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882842.10983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882842.11001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882842.11018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882842.11043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882842.11053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882842.11085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882842.11104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882842.11129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882842.11154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882842.11165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882842.11279: variable 'network_connections' from source: include params
30564 1726882842.11287: variable 'interface' from source: play vars
30564 1726882842.11337: variable 'interface' from source: play vars
30564 1726882842.11387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882842.11495: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882842.11523: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882842.11547: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882842.11579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882842.11608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882842.11626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882842.11645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882842.11664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882842.11702: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882842.11853: variable 'network_connections' from source: include params
30564 1726882842.11856: variable 'interface' from source: play vars
30564 1726882842.11902: variable 'interface' from source: play vars
30564 1726882842.11919: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882842.11923: when evaluation is False, skipping this task
30564 1726882842.11925: _execute() done
30564 1726882842.11928: dumping result to json
30564 1726882842.11930: done dumping result, returning
30564 1726882842.11936: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000d1d]
30564 1726882842.11941: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1d
30564 1726882842.12037: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1d
30564 1726882842.12039: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882842.12088: no more pending results, returning what we have
30564 1726882842.12092: results queue empty
30564 1726882842.12093: checking for any_errors_fatal
30564 1726882842.12098: done checking for any_errors_fatal
30564 1726882842.12099: checking for max_fail_percentage
30564 1726882842.12100: done checking for max_fail_percentage
30564 1726882842.12101: checking to see if all hosts have failed and the running result is not ok
30564 1726882842.12102: done checking to see if all hosts have failed
30564 1726882842.12103: getting the remaining hosts for this loop
30564 1726882842.12105: done getting the remaining hosts for this loop
30564 1726882842.12108: getting the next task for host managed_node2
30564 1726882842.12115: done getting next task for host managed_node2
30564 1726882842.12119: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
30564 1726882842.12124: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882842.12142: getting variables
30564 1726882842.12143: in VariableManager get_vars()
30564 1726882842.12186: Calling all_inventory to load vars for managed_node2
30564 1726882842.12189: Calling groups_inventory to load vars for managed_node2
30564 1726882842.12191: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882842.12200: Calling all_plugins_play to load vars for managed_node2
30564 1726882842.12202: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882842.12205: Calling groups_plugins_play to load vars for managed_node2
30564 1726882842.13134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882842.14086: done with get_vars()
30564 1726882842.14101: done getting variables
30564 1726882842.14143: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:40:42 -0400 (0:00:00.069) 0:00:40.723 ******
30564 1726882842.14169: entering _queue_task() for managed_node2/package
30564 1726882842.14361: worker is 1 (out of 1 available)
30564 1726882842.14374: exiting _queue_task() for managed_node2/package
30564 1726882842.14386: done queuing things up, now waiting for results queue to drain
30564 1726882842.14387: waiting for pending results...
30564 1726882842.14587: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages
30564 1726882842.14680: in run() - task 0e448fcc-3ce9-4216-acec-000000000d1e
30564 1726882842.14691: variable 'ansible_search_path' from source: unknown
30564 1726882842.14695: variable 'ansible_search_path' from source: unknown
30564 1726882842.14723: calling self._execute()
30564 1726882842.14805: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882842.14808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882842.14817: variable 'omit' from source: magic vars
30564 1726882842.15098: variable 'ansible_distribution_major_version' from source: facts
30564 1726882842.15110: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882842.15249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882842.15438: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882842.15477: Loading TestModule 'files' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882842.15502: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882842.15545: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882842.15623: variable 'network_packages' from source: role '' defaults 30564 1726882842.15699: variable '__network_provider_setup' from source: role '' defaults 30564 1726882842.15707: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882842.15751: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882842.15762: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882842.15806: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882842.15924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882842.17351: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882842.17398: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882842.17425: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882842.17449: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882842.17470: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882842.17536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.17556: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.17578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.17609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.17619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.17649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.17667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.17686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.17717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.17725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882842.17868: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882842.17940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.17956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.17977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.18003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.18013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.18080: variable 'ansible_python' from source: facts 30564 1726882842.18093: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882842.18151: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882842.18205: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882842.18292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.18308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.18327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.18353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.18368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.18400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.18419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.18437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.18466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.18481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.18571: variable 'network_connections' from source: include params 
30564 1726882842.18580: variable 'interface' from source: play vars 30564 1726882842.18648: variable 'interface' from source: play vars 30564 1726882842.18702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882842.18721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882842.18741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.18764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882842.18807: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882842.18984: variable 'network_connections' from source: include params 30564 1726882842.18987: variable 'interface' from source: play vars 30564 1726882842.19057: variable 'interface' from source: play vars 30564 1726882842.19084: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882842.19139: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882842.19339: variable 'network_connections' from source: include params 30564 1726882842.19342: variable 'interface' from source: play vars 30564 1726882842.19390: variable 'interface' from source: play vars 30564 1726882842.19406: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882842.19462: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882842.19659: variable 'network_connections' 
from source: include params 30564 1726882842.19667: variable 'interface' from source: play vars 30564 1726882842.19716: variable 'interface' from source: play vars 30564 1726882842.19753: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882842.19801: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882842.19807: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882842.19848: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882842.19994: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882842.20301: variable 'network_connections' from source: include params 30564 1726882842.20304: variable 'interface' from source: play vars 30564 1726882842.20352: variable 'interface' from source: play vars 30564 1726882842.20358: variable 'ansible_distribution' from source: facts 30564 1726882842.20361: variable '__network_rh_distros' from source: role '' defaults 30564 1726882842.20371: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.20382: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882842.20491: variable 'ansible_distribution' from source: facts 30564 1726882842.20494: variable '__network_rh_distros' from source: role '' defaults 30564 1726882842.20499: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.20509: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882842.20617: variable 'ansible_distribution' from source: facts 30564 1726882842.20620: variable '__network_rh_distros' from source: role '' defaults 30564 1726882842.20625: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.20652: variable 'network_provider' from source: set_fact 30564 
1726882842.20664: variable 'ansible_facts' from source: unknown 30564 1726882842.21220: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882842.21224: when evaluation is False, skipping this task 30564 1726882842.21227: _execute() done 30564 1726882842.21229: dumping result to json 30564 1726882842.21231: done dumping result, returning 30564 1726882842.21237: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000000d1e] 30564 1726882842.21243: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1e 30564 1726882842.21340: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1e 30564 1726882842.21343: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882842.21398: no more pending results, returning what we have 30564 1726882842.21405: results queue empty 30564 1726882842.21407: checking for any_errors_fatal 30564 1726882842.21418: done checking for any_errors_fatal 30564 1726882842.21419: checking for max_fail_percentage 30564 1726882842.21421: done checking for max_fail_percentage 30564 1726882842.21421: checking to see if all hosts have failed and the running result is not ok 30564 1726882842.21422: done checking to see if all hosts have failed 30564 1726882842.21423: getting the remaining hosts for this loop 30564 1726882842.21425: done getting the remaining hosts for this loop 30564 1726882842.21428: getting the next task for host managed_node2 30564 1726882842.21435: done getting next task for host managed_node2 30564 1726882842.21439: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882842.21445: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882842.21461: getting variables 30564 1726882842.21463: in VariableManager get_vars() 30564 1726882842.21505: Calling all_inventory to load vars for managed_node2 30564 1726882842.21507: Calling groups_inventory to load vars for managed_node2 30564 1726882842.21513: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882842.21525: Calling all_plugins_play to load vars for managed_node2 30564 1726882842.21528: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882842.21531: Calling groups_plugins_play to load vars for managed_node2 30564 1726882842.22334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882842.23406: done with get_vars() 30564 1726882842.23421: done getting variables 30564 1726882842.23463: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:40:42 -0400 (0:00:00.093) 0:00:40.816 ****** 30564 1726882842.23490: entering _queue_task() for managed_node2/package 30564 1726882842.23687: worker is 1 (out of 1 available) 30564 1726882842.23702: exiting _queue_task() for managed_node2/package 30564 1726882842.23714: done queuing things up, now waiting for results queue to drain 30564 1726882842.23715: waiting for pending results... 
30564 1726882842.23896: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882842.23981: in run() - task 0e448fcc-3ce9-4216-acec-000000000d1f 30564 1726882842.23993: variable 'ansible_search_path' from source: unknown 30564 1726882842.23997: variable 'ansible_search_path' from source: unknown 30564 1726882842.24026: calling self._execute() 30564 1726882842.24100: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882842.24105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882842.24114: variable 'omit' from source: magic vars 30564 1726882842.24385: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.24395: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882842.24482: variable 'network_state' from source: role '' defaults 30564 1726882842.24491: Evaluated conditional (network_state != {}): False 30564 1726882842.24495: when evaluation is False, skipping this task 30564 1726882842.24497: _execute() done 30564 1726882842.24500: dumping result to json 30564 1726882842.24502: done dumping result, returning 30564 1726882842.24509: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000d1f] 30564 1726882842.24514: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1f 30564 1726882842.24616: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d1f 30564 1726882842.24620: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882842.24662: no more pending results, returning what we have 30564 1726882842.24669: results queue empty 30564 1726882842.24670: checking 
for any_errors_fatal 30564 1726882842.24675: done checking for any_errors_fatal 30564 1726882842.24676: checking for max_fail_percentage 30564 1726882842.24678: done checking for max_fail_percentage 30564 1726882842.24678: checking to see if all hosts have failed and the running result is not ok 30564 1726882842.24679: done checking to see if all hosts have failed 30564 1726882842.24680: getting the remaining hosts for this loop 30564 1726882842.24681: done getting the remaining hosts for this loop 30564 1726882842.24692: getting the next task for host managed_node2 30564 1726882842.24699: done getting next task for host managed_node2 30564 1726882842.24703: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882842.24707: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882842.24725: getting variables 30564 1726882842.24726: in VariableManager get_vars() 30564 1726882842.24752: Calling all_inventory to load vars for managed_node2 30564 1726882842.24754: Calling groups_inventory to load vars for managed_node2 30564 1726882842.24756: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882842.24762: Calling all_plugins_play to load vars for managed_node2 30564 1726882842.24767: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882842.24771: Calling groups_plugins_play to load vars for managed_node2 30564 1726882842.25547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882842.26504: done with get_vars() 30564 1726882842.26519: done getting variables 30564 1726882842.26559: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:40:42 -0400 (0:00:00.030) 0:00:40.847 ****** 30564 1726882842.26585: entering _queue_task() for managed_node2/package 30564 1726882842.26774: worker is 1 (out of 1 available) 30564 1726882842.26787: exiting _queue_task() for managed_node2/package 30564 1726882842.26799: done queuing things up, now waiting for results queue to drain 30564 1726882842.26800: waiting for pending results... 
30564 1726882842.26998: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882842.27094: in run() - task 0e448fcc-3ce9-4216-acec-000000000d20 30564 1726882842.27105: variable 'ansible_search_path' from source: unknown 30564 1726882842.27108: variable 'ansible_search_path' from source: unknown 30564 1726882842.27137: calling self._execute() 30564 1726882842.27220: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882842.27223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882842.27232: variable 'omit' from source: magic vars 30564 1726882842.27512: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.27523: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882842.27608: variable 'network_state' from source: role '' defaults 30564 1726882842.27616: Evaluated conditional (network_state != {}): False 30564 1726882842.27619: when evaluation is False, skipping this task 30564 1726882842.27623: _execute() done 30564 1726882842.27626: dumping result to json 30564 1726882842.27628: done dumping result, returning 30564 1726882842.27636: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000000d20] 30564 1726882842.27642: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d20 30564 1726882842.27732: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d20 30564 1726882842.27735: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882842.27787: no more pending results, returning what we have 30564 1726882842.27790: results queue empty 30564 1726882842.27791: checking for 
any_errors_fatal 30564 1726882842.27798: done checking for any_errors_fatal 30564 1726882842.27798: checking for max_fail_percentage 30564 1726882842.27800: done checking for max_fail_percentage 30564 1726882842.27801: checking to see if all hosts have failed and the running result is not ok 30564 1726882842.27801: done checking to see if all hosts have failed 30564 1726882842.27802: getting the remaining hosts for this loop 30564 1726882842.27803: done getting the remaining hosts for this loop 30564 1726882842.27807: getting the next task for host managed_node2 30564 1726882842.27813: done getting next task for host managed_node2 30564 1726882842.27816: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882842.27821: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882842.27837: getting variables 30564 1726882842.27838: in VariableManager get_vars() 30564 1726882842.27876: Calling all_inventory to load vars for managed_node2 30564 1726882842.27879: Calling groups_inventory to load vars for managed_node2 30564 1726882842.27880: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882842.27887: Calling all_plugins_play to load vars for managed_node2 30564 1726882842.27888: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882842.27890: Calling groups_plugins_play to load vars for managed_node2 30564 1726882842.28797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882842.29750: done with get_vars() 30564 1726882842.29767: done getting variables 30564 1726882842.29811: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:40:42 -0400 (0:00:00.032) 0:00:40.879 ****** 30564 1726882842.29834: entering _queue_task() for managed_node2/service 30564 1726882842.30022: worker is 1 (out of 1 available) 30564 1726882842.30033: exiting _queue_task() for managed_node2/service 30564 1726882842.30047: done queuing things up, now waiting for results queue to drain 30564 1726882842.30048: waiting for pending results... 
30564 1726882842.30239: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882842.30338: in run() - task 0e448fcc-3ce9-4216-acec-000000000d21 30564 1726882842.30350: variable 'ansible_search_path' from source: unknown 30564 1726882842.30356: variable 'ansible_search_path' from source: unknown 30564 1726882842.30387: calling self._execute() 30564 1726882842.30458: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882842.30462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882842.30475: variable 'omit' from source: magic vars 30564 1726882842.30741: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.30751: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882842.30838: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882842.30970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882842.32693: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882842.32696: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882842.32714: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882842.32749: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882842.32778: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882842.32851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882842.32882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.32907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.32950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.32965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.33007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.33030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.33052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.33092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.33106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.33143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.33166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.33190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.33225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.33239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.33398: variable 'network_connections' from source: include params 30564 1726882842.33408: variable 'interface' from source: play vars 30564 1726882842.33474: variable 'interface' from source: play vars 30564 1726882842.33540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882842.33691: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882842.33739: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882842.33764: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882842.33794: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882842.33831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882842.33852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882842.33878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.33902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882842.33955: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882842.34142: variable 'network_connections' from source: include params 30564 1726882842.34146: variable 'interface' from source: play vars 30564 1726882842.34192: variable 'interface' from source: play vars 30564 1726882842.34210: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882842.34215: when evaluation is False, skipping this task 30564 1726882842.34218: _execute() done 30564 1726882842.34220: dumping result to json 30564 1726882842.34222: done dumping result, returning 30564 1726882842.34230: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000000d21] 30564 1726882842.34232: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d21 30564 1726882842.34320: done sending task result for task 
0e448fcc-3ce9-4216-acec-000000000d21 30564 1726882842.34329: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882842.34384: no more pending results, returning what we have 30564 1726882842.34387: results queue empty 30564 1726882842.34388: checking for any_errors_fatal 30564 1726882842.34395: done checking for any_errors_fatal 30564 1726882842.34396: checking for max_fail_percentage 30564 1726882842.34397: done checking for max_fail_percentage 30564 1726882842.34398: checking to see if all hosts have failed and the running result is not ok 30564 1726882842.34399: done checking to see if all hosts have failed 30564 1726882842.34400: getting the remaining hosts for this loop 30564 1726882842.34401: done getting the remaining hosts for this loop 30564 1726882842.34405: getting the next task for host managed_node2 30564 1726882842.34411: done getting next task for host managed_node2 30564 1726882842.34415: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882842.34420: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882842.34436: getting variables 30564 1726882842.34438: in VariableManager get_vars() 30564 1726882842.34470: Calling all_inventory to load vars for managed_node2 30564 1726882842.34473: Calling groups_inventory to load vars for managed_node2 30564 1726882842.34476: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882842.34485: Calling all_plugins_play to load vars for managed_node2 30564 1726882842.34487: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882842.34489: Calling groups_plugins_play to load vars for managed_node2 30564 1726882842.35298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882842.36365: done with get_vars() 30564 1726882842.36383: done getting variables 30564 1726882842.36421: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:40:42 -0400 (0:00:00.066) 0:00:40.945 ****** 30564 1726882842.36442: entering _queue_task() for managed_node2/service 30564 1726882842.36639: worker is 1 (out of 1 available) 30564 1726882842.36651: exiting _queue_task() for managed_node2/service 30564 1726882842.36665: done 
queuing things up, now waiting for results queue to drain 30564 1726882842.36666: waiting for pending results... 30564 1726882842.37083: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882842.37089: in run() - task 0e448fcc-3ce9-4216-acec-000000000d22 30564 1726882842.37092: variable 'ansible_search_path' from source: unknown 30564 1726882842.37094: variable 'ansible_search_path' from source: unknown 30564 1726882842.37097: calling self._execute() 30564 1726882842.37280: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882842.37284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882842.37286: variable 'omit' from source: magic vars 30564 1726882842.37544: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.37557: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882842.37715: variable 'network_provider' from source: set_fact 30564 1726882842.37719: variable 'network_state' from source: role '' defaults 30564 1726882842.37729: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882842.37735: variable 'omit' from source: magic vars 30564 1726882842.37798: variable 'omit' from source: magic vars 30564 1726882842.37823: variable 'network_service_name' from source: role '' defaults 30564 1726882842.37891: variable 'network_service_name' from source: role '' defaults 30564 1726882842.37992: variable '__network_provider_setup' from source: role '' defaults 30564 1726882842.38002: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882842.38066: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882842.38072: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882842.38137: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882842.38359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882842.40602: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882842.40680: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882842.40721: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882842.40753: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882842.40781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882842.40860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.40890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.40915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.40970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.40982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.41026: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.41055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.41081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.41121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.41135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.41366: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882842.41473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.41496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.41518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.41554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.41570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.41653: variable 'ansible_python' from source: facts 30564 1726882842.41671: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882842.41754: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882842.41834: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882842.41957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.41982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.42012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.42047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.42060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.42105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882842.42130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882842.42151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.42188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882842.42200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882842.42334: variable 'network_connections' from source: include params 30564 1726882842.42341: variable 'interface' from source: play vars 30564 1726882842.42414: variable 'interface' from source: play vars 30564 1726882842.42518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882842.42709: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882842.42754: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882842.42801: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882842.42839: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882842.42900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882842.42929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882842.42960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882842.43002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882842.43047: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882842.43327: variable 'network_connections' from source: include params 30564 1726882842.43335: variable 'interface' from source: play vars 30564 1726882842.43405: variable 'interface' from source: play vars 30564 1726882842.43441: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882842.43515: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882842.43811: variable 'network_connections' from source: include params 30564 1726882842.43816: variable 'interface' from source: play vars 30564 1726882842.43893: variable 'interface' from source: play vars 30564 1726882842.43912: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882842.43991: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882842.44286: variable 'network_connections' from source: include params 30564 1726882842.44291: variable 'interface' from source: play vars 30564 1726882842.44357: variable 'interface' from source: play vars 30564 1726882842.44415: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882842.44473: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882842.44476: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882842.44536: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882842.44970: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882842.45266: variable 'network_connections' from source: include params 30564 1726882842.45272: variable 'interface' from source: play vars 30564 1726882842.45329: variable 'interface' from source: play vars 30564 1726882842.45336: variable 'ansible_distribution' from source: facts 30564 1726882842.45339: variable '__network_rh_distros' from source: role '' defaults 30564 1726882842.45346: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.45360: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882842.45517: variable 'ansible_distribution' from source: facts 30564 1726882842.45520: variable '__network_rh_distros' from source: role '' defaults 30564 1726882842.45523: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.45537: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882842.45696: variable 'ansible_distribution' from source: facts 30564 1726882842.45700: variable '__network_rh_distros' from source: role '' defaults 30564 1726882842.45705: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.45737: variable 'network_provider' from source: set_fact 30564 1726882842.45758: variable 'omit' from source: magic vars 30564 1726882842.45787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882842.45818: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882842.45835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882842.45851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882842.45861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882842.45891: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882842.45894: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882842.45897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882842.45992: Set connection var ansible_timeout to 10 30564 1726882842.45996: Set connection var ansible_pipelining to False 30564 1726882842.45998: Set connection var ansible_shell_type to sh 30564 1726882842.46005: Set connection var ansible_shell_executable to /bin/sh 30564 1726882842.46018: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882842.46021: Set connection var ansible_connection to ssh 30564 1726882842.46048: variable 'ansible_shell_executable' from source: unknown 30564 1726882842.46051: variable 'ansible_connection' from source: unknown 30564 1726882842.46053: variable 'ansible_module_compression' from source: unknown 30564 1726882842.46055: variable 'ansible_shell_type' from source: unknown 30564 1726882842.46057: variable 'ansible_shell_executable' from source: unknown 30564 1726882842.46060: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882842.46067: variable 'ansible_pipelining' from source: unknown 30564 1726882842.46072: variable 'ansible_timeout' from source: unknown 30564 1726882842.46075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882842.46178: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882842.46186: variable 'omit' from source: magic vars 30564 1726882842.46194: starting attempt loop 30564 1726882842.46196: running the handler 30564 1726882842.46275: variable 'ansible_facts' from source: unknown 30564 1726882842.47103: _low_level_execute_command(): starting 30564 1726882842.47110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882842.47835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882842.47846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.47859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.47880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.47919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.47926: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882842.47935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.47949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882842.47956: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882842.47962: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882842.47973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.47985: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882842.47996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.48004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.48010: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882842.48020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.48097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882842.48117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882842.48129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882842.48266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882842.49933: stdout chunk (state=3): >>>/root <<< 30564 1726882842.50036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882842.50098: stderr chunk (state=3): >>><<< 30564 1726882842.50101: stdout chunk (state=3): >>><<< 30564 1726882842.50122: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882842.50133: _low_level_execute_command(): starting 30564 1726882842.50138: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514 `" && echo ansible-tmp-1726882842.5012152-32428-1614936259514="` echo /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514 `" ) && sleep 0' 30564 1726882842.50745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882842.50755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.50765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.50780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.50817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.50824: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882842.50833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.50846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882842.50855: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882842.50864: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882842.50871: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882842.50880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.50891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.50899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.50905: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882842.50914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.50987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882842.51000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882842.51010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882842.51295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882842.53163: stdout chunk (state=3): >>>ansible-tmp-1726882842.5012152-32428-1614936259514=/root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514 <<< 30564 1726882842.53335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882842.53339: stdout chunk (state=3): >>><<< 30564 1726882842.53344: stderr chunk (state=3): >>><<< 30564 1726882842.53359: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882842.5012152-32428-1614936259514=/root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882842.53393: variable 'ansible_module_compression' from source: unknown 30564 1726882842.53444: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882842.53499: variable 'ansible_facts' from source: unknown 30564 1726882842.53677: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514/AnsiballZ_systemd.py 30564 1726882842.53821: Sending initial data 30564 1726882842.53824: Sent initial data (154 bytes) 30564 1726882842.54717: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882842.54725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.54734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.54746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.54783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.54791: stderr chunk (state=3): >>>debug2: match not found <<< 30564 
1726882842.54803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.54811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882842.54819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882842.54824: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882842.54831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.54840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.54850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.54856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.54862: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882842.54876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.54945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882842.54958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882842.54973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882842.55098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882842.56846: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882842.56942: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882842.57041: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmps1wm860p /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514/AnsiballZ_systemd.py <<< 30564 1726882842.57134: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882842.59849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882842.59962: stderr chunk (state=3): >>><<< 30564 1726882842.59967: stdout chunk (state=3): >>><<< 30564 1726882842.59970: done transferring module to remote 30564 1726882842.59972: _low_level_execute_command(): starting 30564 1726882842.59974: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514/ /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514/AnsiballZ_systemd.py && sleep 0' 30564 1726882842.60516: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882842.60531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.60545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.60561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.60606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.60617: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882842.60630: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.60647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882842.60658: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882842.60672: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882842.60684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.60697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.60710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.60721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.60731: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882842.60743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.60821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882842.60838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882842.60853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882842.60997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882842.62825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882842.62888: stderr chunk (state=3): >>><<< 30564 1726882842.62891: stdout chunk (state=3): >>><<< 30564 1726882842.62974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882842.62980: _low_level_execute_command(): starting 30564 1726882842.62983: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514/AnsiballZ_systemd.py && sleep 0' 30564 1726882842.63513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882842.63526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.63539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.63556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.63601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.63614: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882842.63629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.63647: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882842.63660: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882842.63678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882842.63692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.63706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.63722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.63735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882842.63746: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882842.63759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.63837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882842.63854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882842.63874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882842.64012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882842.89236: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", 
"BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882842.89277: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9150464", "MemoryAvailable": "infinity", "CPUUsageNSec": "2140289000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": 
"no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882842.89287: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": 
"shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": 
"none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882842.90846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882842.90850: stdout chunk (state=3): >>><<< 30564 1726882842.90858: stderr chunk (state=3): >>><<< 30564 1726882842.90885: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9150464", "MemoryAvailable": "infinity", "CPUUsageNSec": "2140289000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 
21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882842.91060: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882842.91081: _low_level_execute_command(): starting 30564 1726882842.91084: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882842.5012152-32428-1614936259514/ > /dev/null 2>&1 && sleep 0' 30564 1726882842.91875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.91881: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882842.91897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882842.91932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882842.91938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882842.91951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882842.91957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882842.91975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882842.92057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882842.92078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882842.92210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882842.94036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882842.94124: stderr chunk (state=3): >>><<< 30564 1726882842.94127: stdout chunk (state=3): >>><<< 30564 1726882842.94144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882842.94151: handler run complete 30564 1726882842.94215: attempt loop complete, returning result 30564 1726882842.94219: _execute() done 30564 1726882842.94221: dumping result to json 30564 1726882842.94238: done dumping result, returning 30564 1726882842.94249: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000000d22] 30564 1726882842.94255: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d22 30564 1726882842.94543: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d22 30564 1726882842.94546: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882842.94607: no more pending results, returning what we have 30564 1726882842.94610: results queue empty 30564 1726882842.94611: checking for any_errors_fatal 30564 1726882842.94617: done checking for 
any_errors_fatal 30564 1726882842.94618: checking for max_fail_percentage 30564 1726882842.94620: done checking for max_fail_percentage 30564 1726882842.94620: checking to see if all hosts have failed and the running result is not ok 30564 1726882842.94621: done checking to see if all hosts have failed 30564 1726882842.94622: getting the remaining hosts for this loop 30564 1726882842.94624: done getting the remaining hosts for this loop 30564 1726882842.94627: getting the next task for host managed_node2 30564 1726882842.94634: done getting next task for host managed_node2 30564 1726882842.94637: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882842.94642: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882842.94652: getting variables 30564 1726882842.94653: in VariableManager get_vars() 30564 1726882842.94687: Calling all_inventory to load vars for managed_node2 30564 1726882842.94690: Calling groups_inventory to load vars for managed_node2 30564 1726882842.94692: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882842.94701: Calling all_plugins_play to load vars for managed_node2 30564 1726882842.94704: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882842.94706: Calling groups_plugins_play to load vars for managed_node2 30564 1726882842.96262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882842.98217: done with get_vars() 30564 1726882842.98241: done getting variables 30564 1726882842.98316: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:40:42 -0400 (0:00:00.619) 0:00:41.564 ****** 30564 1726882842.98360: entering _queue_task() for managed_node2/service 30564 1726882842.98725: worker is 1 (out of 1 available) 30564 1726882842.98738: exiting _queue_task() for managed_node2/service 30564 1726882842.98751: done queuing things up, now waiting for results queue to drain 30564 1726882842.98753: waiting for pending results... 
30564 1726882842.99087: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882842.99249: in run() - task 0e448fcc-3ce9-4216-acec-000000000d23 30564 1726882842.99273: variable 'ansible_search_path' from source: unknown 30564 1726882842.99281: variable 'ansible_search_path' from source: unknown 30564 1726882842.99326: calling self._execute() 30564 1726882842.99444: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882842.99465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882842.99485: variable 'omit' from source: magic vars 30564 1726882842.99920: variable 'ansible_distribution_major_version' from source: facts 30564 1726882842.99938: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.00081: variable 'network_provider' from source: set_fact 30564 1726882843.00094: Evaluated conditional (network_provider == "nm"): True 30564 1726882843.00206: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882843.00328: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882843.00548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882843.03591: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882843.03637: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882843.03670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882843.03698: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882843.03718: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882843.03780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882843.03803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882843.03821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882843.03846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882843.03857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882843.03895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882843.03914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882843.03930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882843.03955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882843.03970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882843.04011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882843.04033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882843.04049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882843.04078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882843.04091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882843.04200: variable 'network_connections' from source: include params 30564 1726882843.04210: variable 'interface' from source: play vars 30564 1726882843.04262: variable 'interface' from source: play vars 30564 1726882843.04329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882843.04442: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882843.04474: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882843.04496: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882843.04517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882843.04549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882843.04567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882843.04587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882843.04604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882843.04645: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882843.04810: variable 'network_connections' from source: include params 30564 1726882843.04814: variable 'interface' from source: play vars 30564 1726882843.04859: variable 'interface' from source: play vars 30564 1726882843.04886: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882843.04890: when evaluation is False, skipping this task 30564 1726882843.04893: _execute() done 30564 1726882843.04895: dumping result to json 30564 1726882843.04897: done dumping result, returning 30564 1726882843.04905: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000000d23] 30564 
1726882843.04918: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d23 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882843.05053: no more pending results, returning what we have 30564 1726882843.05057: results queue empty 30564 1726882843.05058: checking for any_errors_fatal 30564 1726882843.05082: done checking for any_errors_fatal 30564 1726882843.05083: checking for max_fail_percentage 30564 1726882843.05085: done checking for max_fail_percentage 30564 1726882843.05086: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.05086: done checking to see if all hosts have failed 30564 1726882843.05087: getting the remaining hosts for this loop 30564 1726882843.05089: done getting the remaining hosts for this loop 30564 1726882843.05093: getting the next task for host managed_node2 30564 1726882843.05103: done getting next task for host managed_node2 30564 1726882843.05106: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882843.05112: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882843.05132: getting variables 30564 1726882843.05133: in VariableManager get_vars() 30564 1726882843.05169: Calling all_inventory to load vars for managed_node2 30564 1726882843.05172: Calling groups_inventory to load vars for managed_node2 30564 1726882843.05174: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.05183: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.05186: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.05188: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.06159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.06692: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d23 30564 1726882843.06696: WORKER PROCESS EXITING 30564 1726882843.07684: done with get_vars() 30564 1726882843.07700: done getting variables 30564 1726882843.07740: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:40:43 -0400 (0:00:00.094) 0:00:41.659 ****** 30564 1726882843.07783: entering _queue_task() for managed_node2/service 30564 1726882843.08058: worker is 1 (out of 1 available) 30564 
1726882843.08073: exiting _queue_task() for managed_node2/service 30564 1726882843.08085: done queuing things up, now waiting for results queue to drain 30564 1726882843.08086: waiting for pending results... 30564 1726882843.08414: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882843.08553: in run() - task 0e448fcc-3ce9-4216-acec-000000000d24 30564 1726882843.08572: variable 'ansible_search_path' from source: unknown 30564 1726882843.08576: variable 'ansible_search_path' from source: unknown 30564 1726882843.08605: calling self._execute() 30564 1726882843.08703: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.08707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.08719: variable 'omit' from source: magic vars 30564 1726882843.09182: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.09200: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.09330: variable 'network_provider' from source: set_fact 30564 1726882843.09333: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882843.09336: when evaluation is False, skipping this task 30564 1726882843.09340: _execute() done 30564 1726882843.09342: dumping result to json 30564 1726882843.09345: done dumping result, returning 30564 1726882843.09351: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000000d24] 30564 1726882843.09358: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d24 30564 1726882843.09464: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d24 30564 1726882843.09477: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 
1726882843.09523: no more pending results, returning what we have 30564 1726882843.09527: results queue empty 30564 1726882843.09528: checking for any_errors_fatal 30564 1726882843.09538: done checking for any_errors_fatal 30564 1726882843.09539: checking for max_fail_percentage 30564 1726882843.09540: done checking for max_fail_percentage 30564 1726882843.09541: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.09542: done checking to see if all hosts have failed 30564 1726882843.09542: getting the remaining hosts for this loop 30564 1726882843.09544: done getting the remaining hosts for this loop 30564 1726882843.09548: getting the next task for host managed_node2 30564 1726882843.09556: done getting next task for host managed_node2 30564 1726882843.09560: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882843.09566: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882843.09586: getting variables 30564 1726882843.09587: in VariableManager get_vars() 30564 1726882843.09618: Calling all_inventory to load vars for managed_node2 30564 1726882843.09620: Calling groups_inventory to load vars for managed_node2 30564 1726882843.09622: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.09631: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.09633: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.09636: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.10477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.11524: done with get_vars() 30564 1726882843.11539: done getting variables 30564 1726882843.11585: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:40:43 -0400 (0:00:00.038) 0:00:41.697 ****** 30564 1726882843.11610: entering _queue_task() for managed_node2/copy 30564 1726882843.11812: worker is 1 (out of 1 available) 30564 1726882843.11826: exiting _queue_task() for managed_node2/copy 30564 1726882843.11838: done queuing things up, now waiting for results queue to drain 30564 1726882843.11840: waiting for pending results... 
30564 1726882843.12079: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882843.12229: in run() - task 0e448fcc-3ce9-4216-acec-000000000d25 30564 1726882843.12240: variable 'ansible_search_path' from source: unknown 30564 1726882843.12244: variable 'ansible_search_path' from source: unknown 30564 1726882843.12279: calling self._execute() 30564 1726882843.12381: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.12387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.12399: variable 'omit' from source: magic vars 30564 1726882843.12797: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.12809: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.12929: variable 'network_provider' from source: set_fact 30564 1726882843.12936: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882843.12938: when evaluation is False, skipping this task 30564 1726882843.12941: _execute() done 30564 1726882843.12943: dumping result to json 30564 1726882843.12947: done dumping result, returning 30564 1726882843.12958: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000000d25] 30564 1726882843.12961: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d25 30564 1726882843.13080: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d25 30564 1726882843.13083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882843.13130: no more pending results, returning what we have 30564 1726882843.13135: results queue empty 30564 1726882843.13136: checking for 
any_errors_fatal 30564 1726882843.13143: done checking for any_errors_fatal 30564 1726882843.13144: checking for max_fail_percentage 30564 1726882843.13146: done checking for max_fail_percentage 30564 1726882843.13146: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.13147: done checking to see if all hosts have failed 30564 1726882843.13148: getting the remaining hosts for this loop 30564 1726882843.13150: done getting the remaining hosts for this loop 30564 1726882843.13154: getting the next task for host managed_node2 30564 1726882843.13168: done getting next task for host managed_node2 30564 1726882843.13173: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882843.13179: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882843.13203: getting variables 30564 1726882843.13205: in VariableManager get_vars() 30564 1726882843.13241: Calling all_inventory to load vars for managed_node2 30564 1726882843.13244: Calling groups_inventory to load vars for managed_node2 30564 1726882843.13246: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.13258: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.13261: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.13266: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.14204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.15148: done with get_vars() 30564 1726882843.15162: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:40:43 -0400 (0:00:00.036) 0:00:41.733 ****** 30564 1726882843.15222: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882843.15413: worker is 1 (out of 1 available) 30564 1726882843.15426: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882843.15439: done queuing things up, now waiting for results queue to drain 30564 1726882843.15440: waiting for pending results... 
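Every entry above carries the same prefix: a worker PID (`30564` throughout this run) followed by a Unix epoch timestamp with sub-second precision (e.g. `1726882843.15222`, which corresponds to the 2024-09-20 21:40 EDT task timestamps printed in the `TASK [...]` banners). When post-processing a log like this, a minimal sketch for splitting that prefix off each entry (the format is assumed from the lines in this transcript, not from any documented Ansible contract):

```python
import re
from datetime import datetime, timezone

# '<pid> <epoch_seconds>: <message>' — the prefix format seen in this -vvv log.
LINE_RE = re.compile(r"^(\d+) (\d+\.\d+): (.*)$")

def parse_debug_line(line):
    """Split one ansible -vvv debug entry into (pid, utc_timestamp, message).

    Returns None for lines that do not match the assumed prefix format
    (e.g. wrapped continuation text or SSH stderr chunks).
    """
    m = LINE_RE.match(line)
    if not m:
        return None
    pid, ts, msg = m.groups()
    return int(pid), datetime.fromtimestamp(float(ts), tz=timezone.utc), msg
```

For example, `parse_debug_line("30564 1726882843.15222: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections")` yields the PID `30564`, a UTC datetime, and the bare message, which makes it easy to diff inter-entry latencies or group entries by task.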
30564 1726882843.15628: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882843.15723: in run() - task 0e448fcc-3ce9-4216-acec-000000000d26 30564 1726882843.15734: variable 'ansible_search_path' from source: unknown 30564 1726882843.15738: variable 'ansible_search_path' from source: unknown 30564 1726882843.15765: calling self._execute() 30564 1726882843.15841: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.15845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.15854: variable 'omit' from source: magic vars 30564 1726882843.16121: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.16131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.16137: variable 'omit' from source: magic vars 30564 1726882843.16186: variable 'omit' from source: magic vars 30564 1726882843.16297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882843.18064: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882843.18270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882843.18274: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882843.18278: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882843.18280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882843.18406: variable 'network_provider' from source: set_fact 30564 1726882843.18466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882843.18495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882843.18520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882843.18561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882843.18580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882843.18650: variable 'omit' from source: magic vars 30564 1726882843.18755: variable 'omit' from source: magic vars 30564 1726882843.18854: variable 'network_connections' from source: include params 30564 1726882843.18866: variable 'interface' from source: play vars 30564 1726882843.18927: variable 'interface' from source: play vars 30564 1726882843.19069: variable 'omit' from source: magic vars 30564 1726882843.19081: variable '__lsr_ansible_managed' from source: task vars 30564 1726882843.19155: variable '__lsr_ansible_managed' from source: task vars 30564 1726882843.19418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882843.19631: Loaded config def from plugin (lookup/template) 30564 1726882843.19634: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882843.19654: File lookup term: get_ansible_managed.j2 30564 1726882843.19657: variable 
'ansible_search_path' from source: unknown 30564 1726882843.19661: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882843.19675: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882843.19687: variable 'ansible_search_path' from source: unknown 30564 1726882843.24817: variable 'ansible_managed' from source: unknown 30564 1726882843.24942: variable 'omit' from source: magic vars 30564 1726882843.24972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882843.24994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882843.25016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882843.25026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882843.25041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882843.25063: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882843.25066: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.25073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.25170: Set connection var ansible_timeout to 10 30564 1726882843.25174: Set connection var ansible_pipelining to False 30564 1726882843.25177: Set connection var ansible_shell_type to sh 30564 1726882843.25179: Set connection var ansible_shell_executable to /bin/sh 30564 1726882843.25186: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882843.25189: Set connection var ansible_connection to ssh 30564 1726882843.25214: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.25217: variable 'ansible_connection' from source: unknown 30564 1726882843.25220: variable 'ansible_module_compression' from source: unknown 30564 1726882843.25222: variable 'ansible_shell_type' from source: unknown 30564 1726882843.25224: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.25227: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.25230: variable 'ansible_pipelining' from source: unknown 30564 1726882843.25232: variable 'ansible_timeout' from source: unknown 30564 1726882843.25239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.25387: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882843.25398: variable 'omit' from 
source: magic vars 30564 1726882843.25401: starting attempt loop 30564 1726882843.25404: running the handler 30564 1726882843.25406: _low_level_execute_command(): starting 30564 1726882843.25408: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882843.26043: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882843.26049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.26079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.26083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.26113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.26119: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882843.26130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.26148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882843.26151: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882843.26153: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882843.26162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.26177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.26186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.26194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.26201: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882843.26210: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.26280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.26312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.26315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.26449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.28105: stdout chunk (state=3): >>>/root <<< 30564 1726882843.28210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.28283: stderr chunk (state=3): >>><<< 30564 1726882843.28296: stdout chunk (state=3): >>><<< 30564 1726882843.28400: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882843.28403: 
_low_level_execute_command(): starting 30564 1726882843.28406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150 `" && echo ansible-tmp-1726882843.283206-32460-71907800716150="` echo /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150 `" ) && sleep 0' 30564 1726882843.28953: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882843.28970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.28986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.29004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.29047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.29061: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882843.29081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.29099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882843.29112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882843.29123: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882843.29135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.29149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.29171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.29184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882843.29195: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882843.29209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.29289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.29310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.29326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.29456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.31332: stdout chunk (state=3): >>>ansible-tmp-1726882843.283206-32460-71907800716150=/root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150 <<< 30564 1726882843.31451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.31517: stderr chunk (state=3): >>><<< 30564 1726882843.31543: stdout chunk (state=3): >>><<< 30564 1726882843.31776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882843.283206-32460-71907800716150=/root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882843.31784: variable 'ansible_module_compression' from source: unknown 30564 1726882843.31786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882843.31789: variable 'ansible_facts' from source: unknown 30564 1726882843.31823: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150/AnsiballZ_network_connections.py 30564 1726882843.31985: Sending initial data 30564 1726882843.31988: Sent initial data (166 bytes) 30564 1726882843.32981: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882843.32996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.33010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.33028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.33074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.33093: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882843.33108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.33126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882843.33138: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882843.33150: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882843.33161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.33184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.33206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.33221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.33233: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882843.33248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.33333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.33350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.33367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.33500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.35247: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882843.35340: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882843.35441: stdout 
chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp79brgmky /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150/AnsiballZ_network_connections.py <<< 30564 1726882843.35537: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882843.37487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.37614: stderr chunk (state=3): >>><<< 30564 1726882843.37617: stdout chunk (state=3): >>><<< 30564 1726882843.37619: done transferring module to remote 30564 1726882843.37621: _low_level_execute_command(): starting 30564 1726882843.37623: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150/ /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150/AnsiballZ_network_connections.py && sleep 0' 30564 1726882843.38235: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882843.38249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.38275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.38295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.38337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.38349: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882843.38365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.38393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882843.38406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882843.38418: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 30564 1726882843.38430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.38443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.38458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.38476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.38495: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882843.38509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.38589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.38621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.38634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.38760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.40577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.40580: stdout chunk (state=3): >>><<< 30564 1726882843.40583: stderr chunk (state=3): >>><<< 30564 1726882843.40594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882843.40601: _low_level_execute_command(): starting 30564 1726882843.40604: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150/AnsiballZ_network_connections.py && sleep 0' 30564 1726882843.41193: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882843.41201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882843.41212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.41226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.41262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.41273: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882843.41281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.41295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882843.41302: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882843.41309: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882843.41316: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 30564 1726882843.41328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.41337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.41342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882843.41351: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882843.41357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.41433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.41450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.41462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.41598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.63808: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882843.65241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882843.65279: stderr chunk (state=3): >>><<< 30564 1726882843.65283: stdout chunk (state=3): >>><<< 30564 1726882843.65297: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882843.65324: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882843.65331: _low_level_execute_command(): starting 30564 1726882843.65337: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882843.283206-32460-71907800716150/ > /dev/null 2>&1 && sleep 0' 30564 1726882843.65769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.65787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.65815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.65818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.65869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.65873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.65985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.67781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.67821: stderr chunk (state=3): >>><<< 30564 1726882843.67824: stdout chunk (state=3): >>><<< 30564 1726882843.67833: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882843.67839: handler run complete 30564 1726882843.67862: attempt loop complete, returning result 30564 1726882843.67865: _execute() done 30564 1726882843.67868: dumping result to json 30564 1726882843.67875: done dumping result, returning 30564 1726882843.67884: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-000000000d26] 30564 1726882843.67888: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d26 30564 1726882843.67990: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d26 30564 1726882843.67992: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb skipped because already active 30564 1726882843.68088: no more pending results, returning what we have 30564 1726882843.68091: results queue empty 30564 1726882843.68092: checking for any_errors_fatal 30564 1726882843.68099: done checking for any_errors_fatal 30564 1726882843.68100: checking for max_fail_percentage 30564 1726882843.68103: done checking for max_fail_percentage 30564 1726882843.68104: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.68104: done checking to see if all hosts have failed 30564 1726882843.68105: getting the remaining hosts for this loop 30564 1726882843.68107: done getting the remaining hosts for this loop 30564 1726882843.68111: getting the next task for 
host managed_node2 30564 1726882843.68118: done getting next task for host managed_node2 30564 1726882843.68122: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882843.68126: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882843.68137: getting variables 30564 1726882843.68138: in VariableManager get_vars() 30564 1726882843.68173: Calling all_inventory to load vars for managed_node2 30564 1726882843.68176: Calling groups_inventory to load vars for managed_node2 30564 1726882843.68178: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.68187: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.68190: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.68192: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.69156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.70092: done with get_vars() 30564 1726882843.70108: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:40:43 -0400 (0:00:00.549) 0:00:42.283 ****** 30564 1726882843.70171: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882843.70372: worker is 1 (out of 1 available) 30564 1726882843.70383: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882843.70397: done queuing things up, now waiting for results queue to drain 30564 1726882843.70399: waiting for pending results... 
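The "Configure networking state" task queued above will be skipped in this run because the role default `network_state` is empty (the log shows `Evaluated conditional (network_state != {}): False`). As a hypothetical sketch only (interface values are illustrative, not taken from this run), a caller that wanted this task to actually execute might pass a non-empty `network_state` to the role:

```yaml
# Hypothetical sketch: invoking fedora.linux_system_roles.network with a
# non-empty network_state so the "Configure networking state" task runs.
# Interface name/type here are illustrative, not from this log.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: statebr
              type: linux-bridge
              state: up
```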
30564 1726882843.70595: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882843.70691: in run() - task 0e448fcc-3ce9-4216-acec-000000000d27 30564 1726882843.70703: variable 'ansible_search_path' from source: unknown 30564 1726882843.70706: variable 'ansible_search_path' from source: unknown 30564 1726882843.70734: calling self._execute() 30564 1726882843.70809: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.70813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.70822: variable 'omit' from source: magic vars 30564 1726882843.71098: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.71110: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.71194: variable 'network_state' from source: role '' defaults 30564 1726882843.71203: Evaluated conditional (network_state != {}): False 30564 1726882843.71206: when evaluation is False, skipping this task 30564 1726882843.71209: _execute() done 30564 1726882843.71212: dumping result to json 30564 1726882843.71215: done dumping result, returning 30564 1726882843.71218: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000000d27] 30564 1726882843.71228: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d27 30564 1726882843.71311: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d27 30564 1726882843.71314: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882843.71371: no more pending results, returning what we have 30564 1726882843.71375: results queue empty 30564 1726882843.71376: checking for any_errors_fatal 30564 1726882843.71384: done checking for any_errors_fatal 
30564 1726882843.71385: checking for max_fail_percentage 30564 1726882843.71386: done checking for max_fail_percentage 30564 1726882843.71387: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.71388: done checking to see if all hosts have failed 30564 1726882843.71388: getting the remaining hosts for this loop 30564 1726882843.71389: done getting the remaining hosts for this loop 30564 1726882843.71392: getting the next task for host managed_node2 30564 1726882843.71398: done getting next task for host managed_node2 30564 1726882843.71402: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882843.71406: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882843.71421: getting variables 30564 1726882843.71423: in VariableManager get_vars() 30564 1726882843.71458: Calling all_inventory to load vars for managed_node2 30564 1726882843.71460: Calling groups_inventory to load vars for managed_node2 30564 1726882843.71461: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.71470: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.71472: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.71474: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.72240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.73281: done with get_vars() 30564 1726882843.73298: done getting variables 30564 1726882843.73337: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:40:43 -0400 (0:00:00.031) 0:00:42.314 ****** 30564 1726882843.73359: entering _queue_task() for managed_node2/debug 30564 1726882843.73536: worker is 1 (out of 1 available) 30564 1726882843.73548: exiting _queue_task() for managed_node2/debug 30564 1726882843.73559: done queuing things up, now waiting for results queue to drain 30564 1726882843.73560: waiting for pending results... 
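The debug task queued here (task path `roles/network/tasks/main.yml:177`) prints `__network_connections_result.stderr_lines`, as seen in the `ok:` result that follows. A minimal sketch of an equivalent debug task, reconstructed from the variable name shown in this log (the role's actual task definition may differ):

```yaml
# Sketch of a debug task equivalent to the one queued in this log;
# it surfaces the stderr lines captured from the network provider run.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```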
30564 1726882843.73753: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882843.73841: in run() - task 0e448fcc-3ce9-4216-acec-000000000d28 30564 1726882843.73853: variable 'ansible_search_path' from source: unknown 30564 1726882843.73857: variable 'ansible_search_path' from source: unknown 30564 1726882843.73886: calling self._execute() 30564 1726882843.73959: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.73962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.73973: variable 'omit' from source: magic vars 30564 1726882843.74234: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.74244: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.74251: variable 'omit' from source: magic vars 30564 1726882843.74302: variable 'omit' from source: magic vars 30564 1726882843.74324: variable 'omit' from source: magic vars 30564 1726882843.74357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882843.74387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882843.74403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882843.74417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882843.74426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882843.74449: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882843.74452: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.74454: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882843.74527: Set connection var ansible_timeout to 10 30564 1726882843.74531: Set connection var ansible_pipelining to False 30564 1726882843.74533: Set connection var ansible_shell_type to sh 30564 1726882843.74539: Set connection var ansible_shell_executable to /bin/sh 30564 1726882843.74548: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882843.74550: Set connection var ansible_connection to ssh 30564 1726882843.74571: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.74574: variable 'ansible_connection' from source: unknown 30564 1726882843.74577: variable 'ansible_module_compression' from source: unknown 30564 1726882843.74579: variable 'ansible_shell_type' from source: unknown 30564 1726882843.74581: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.74585: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.74587: variable 'ansible_pipelining' from source: unknown 30564 1726882843.74589: variable 'ansible_timeout' from source: unknown 30564 1726882843.74591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.74692: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882843.74702: variable 'omit' from source: magic vars 30564 1726882843.74705: starting attempt loop 30564 1726882843.74708: running the handler 30564 1726882843.74799: variable '__network_connections_result' from source: set_fact 30564 1726882843.74839: handler run complete 30564 1726882843.74851: attempt loop complete, returning result 30564 1726882843.74854: _execute() done 30564 1726882843.74857: dumping result to json 30564 1726882843.74859: 
done dumping result, returning 30564 1726882843.74873: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-000000000d28] 30564 1726882843.74875: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d28 30564 1726882843.74962: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d28 30564 1726882843.74967: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb skipped because already active" ] } 30564 1726882843.75047: no more pending results, returning what we have 30564 1726882843.75050: results queue empty 30564 1726882843.75051: checking for any_errors_fatal 30564 1726882843.75055: done checking for any_errors_fatal 30564 1726882843.75056: checking for max_fail_percentage 30564 1726882843.75058: done checking for max_fail_percentage 30564 1726882843.75059: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.75059: done checking to see if all hosts have failed 30564 1726882843.75060: getting the remaining hosts for this loop 30564 1726882843.75061: done getting the remaining hosts for this loop 30564 1726882843.75066: getting the next task for host managed_node2 30564 1726882843.75074: done getting next task for host managed_node2 30564 1726882843.75078: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882843.75082: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882843.75091: getting variables 30564 1726882843.75093: in VariableManager get_vars() 30564 1726882843.75114: Calling all_inventory to load vars for managed_node2 30564 1726882843.75116: Calling groups_inventory to load vars for managed_node2 30564 1726882843.75118: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.75123: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.75125: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.75127: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.75900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.76845: done with get_vars() 30564 1726882843.76859: done getting variables 30564 1726882843.76902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:40:43 -0400 (0:00:00.035) 0:00:42.350 ****** 30564 1726882843.76927: entering _queue_task() for managed_node2/debug 30564 1726882843.77093: worker is 1 (out of 1 available) 30564 1726882843.77104: exiting _queue_task() for managed_node2/debug 30564 1726882843.77115: done queuing things up, now waiting for results queue to drain 30564 1726882843.77116: waiting for pending results... 30564 1726882843.77301: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882843.77384: in run() - task 0e448fcc-3ce9-4216-acec-000000000d29 30564 1726882843.77394: variable 'ansible_search_path' from source: unknown 30564 1726882843.77398: variable 'ansible_search_path' from source: unknown 30564 1726882843.77427: calling self._execute() 30564 1726882843.77500: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.77505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.77514: variable 'omit' from source: magic vars 30564 1726882843.77772: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.77784: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.77790: variable 'omit' from source: magic vars 30564 1726882843.77836: variable 'omit' from source: magic vars 30564 1726882843.77857: variable 'omit' from source: magic vars 30564 1726882843.77895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882843.77919: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882843.77935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882843.77948: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882843.77957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882843.77986: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882843.77989: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.77992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.78057: Set connection var ansible_timeout to 10 30564 1726882843.78060: Set connection var ansible_pipelining to False 30564 1726882843.78064: Set connection var ansible_shell_type to sh 30564 1726882843.78073: Set connection var ansible_shell_executable to /bin/sh 30564 1726882843.78079: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882843.78082: Set connection var ansible_connection to ssh 30564 1726882843.78103: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.78106: variable 'ansible_connection' from source: unknown 30564 1726882843.78109: variable 'ansible_module_compression' from source: unknown 30564 1726882843.78111: variable 'ansible_shell_type' from source: unknown 30564 1726882843.78113: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.78116: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.78118: variable 'ansible_pipelining' from source: unknown 30564 1726882843.78120: variable 'ansible_timeout' from source: unknown 30564 1726882843.78124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.78221: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882843.78234: variable 'omit' from source: magic vars 30564 1726882843.78237: starting attempt loop 30564 1726882843.78239: running the handler 30564 1726882843.78280: variable '__network_connections_result' from source: set_fact 30564 1726882843.78332: variable '__network_connections_result' from source: set_fact 30564 1726882843.78411: handler run complete 30564 1726882843.78429: attempt loop complete, returning result 30564 1726882843.78432: _execute() done 30564 1726882843.78434: dumping result to json 30564 1726882843.78437: done dumping result, returning 30564 1726882843.78444: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-000000000d29] 30564 1726882843.78448: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d29 30564 1726882843.78544: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d29 30564 1726882843.78546: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 6d0eee33-2e09-457c-9193-5de1eabb8deb skipped because already active" ] } } 30564 1726882843.78647: no more pending results, returning what we have 30564 1726882843.78650: results queue empty 30564 
1726882843.78650: checking for any_errors_fatal 30564 1726882843.78654: done checking for any_errors_fatal 30564 1726882843.78655: checking for max_fail_percentage 30564 1726882843.78656: done checking for max_fail_percentage 30564 1726882843.78656: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.78657: done checking to see if all hosts have failed 30564 1726882843.78657: getting the remaining hosts for this loop 30564 1726882843.78658: done getting the remaining hosts for this loop 30564 1726882843.78660: getting the next task for host managed_node2 30564 1726882843.78668: done getting next task for host managed_node2 30564 1726882843.78672: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882843.78676: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882843.78684: getting variables 30564 1726882843.78685: in VariableManager get_vars() 30564 1726882843.78705: Calling all_inventory to load vars for managed_node2 30564 1726882843.78707: Calling groups_inventory to load vars for managed_node2 30564 1726882843.78712: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.78717: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.78719: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.78721: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.82775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.83697: done with get_vars() 30564 1726882843.83712: done getting variables 30564 1726882843.83744: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:40:43 -0400 (0:00:00.068) 0:00:42.418 ****** 30564 1726882843.83766: entering _queue_task() for managed_node2/debug 30564 1726882843.83974: worker is 1 (out of 1 available) 30564 1726882843.83988: exiting _queue_task() for managed_node2/debug 30564 1726882843.83999: done queuing things up, now waiting for results queue to drain 30564 1726882843.84002: waiting for pending results... 
30564 1726882843.84187: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882843.84290: in run() - task 0e448fcc-3ce9-4216-acec-000000000d2a 30564 1726882843.84304: variable 'ansible_search_path' from source: unknown 30564 1726882843.84309: variable 'ansible_search_path' from source: unknown 30564 1726882843.84333: calling self._execute() 30564 1726882843.84412: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.84417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.84426: variable 'omit' from source: magic vars 30564 1726882843.84698: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.84711: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.84799: variable 'network_state' from source: role '' defaults 30564 1726882843.84808: Evaluated conditional (network_state != {}): False 30564 1726882843.84813: when evaluation is False, skipping this task 30564 1726882843.84816: _execute() done 30564 1726882843.84818: dumping result to json 30564 1726882843.84822: done dumping result, returning 30564 1726882843.84825: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000000d2a] 30564 1726882843.84833: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d2a 30564 1726882843.84919: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d2a 30564 1726882843.84922: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882843.84976: no more pending results, returning what we have 30564 1726882843.84980: results queue empty 30564 1726882843.84981: checking for any_errors_fatal 30564 1726882843.84993: done checking for any_errors_fatal 30564 1726882843.84994: checking for 
max_fail_percentage 30564 1726882843.84996: done checking for max_fail_percentage 30564 1726882843.84996: checking to see if all hosts have failed and the running result is not ok 30564 1726882843.84997: done checking to see if all hosts have failed 30564 1726882843.84998: getting the remaining hosts for this loop 30564 1726882843.84999: done getting the remaining hosts for this loop 30564 1726882843.85003: getting the next task for host managed_node2 30564 1726882843.85010: done getting next task for host managed_node2 30564 1726882843.85014: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882843.85018: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882843.85035: getting variables 30564 1726882843.85036: in VariableManager get_vars() 30564 1726882843.85078: Calling all_inventory to load vars for managed_node2 30564 1726882843.85081: Calling groups_inventory to load vars for managed_node2 30564 1726882843.85083: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882843.85091: Calling all_plugins_play to load vars for managed_node2 30564 1726882843.85093: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882843.85096: Calling groups_plugins_play to load vars for managed_node2 30564 1726882843.85867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882843.86828: done with get_vars() 30564 1726882843.86841: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:40:43 -0400 (0:00:00.031) 0:00:42.450 ****** 30564 1726882843.86910: entering _queue_task() for managed_node2/ping 30564 1726882843.87094: worker is 1 (out of 1 available) 30564 1726882843.87107: exiting _queue_task() for managed_node2/ping 30564 1726882843.87118: done queuing things up, now waiting for results queue to drain 30564 1726882843.87119: waiting for pending results... 
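The "Re-test connectivity" task queued here (task path `roles/network/tasks/main.yml:192`) enters `_queue_task()` for `managed_node2/ping`, i.e. it uses Ansible's `ping` module to confirm the managed host is still reachable after the network changes. A minimal sketch of such a task (the role's actual definition may add conditionals or timeouts):

```yaml
# Sketch of a connectivity re-test using the ansible.builtin.ping module,
# matching the managed_node2/ping invocation shown in this log.
- name: Re-test connectivity
  ansible.builtin.ping:
```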
30564 1726882843.87295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882843.87393: in run() - task 0e448fcc-3ce9-4216-acec-000000000d2b 30564 1726882843.87402: variable 'ansible_search_path' from source: unknown 30564 1726882843.87406: variable 'ansible_search_path' from source: unknown 30564 1726882843.87433: calling self._execute() 30564 1726882843.87506: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.87509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.87518: variable 'omit' from source: magic vars 30564 1726882843.87788: variable 'ansible_distribution_major_version' from source: facts 30564 1726882843.87799: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882843.87805: variable 'omit' from source: magic vars 30564 1726882843.87845: variable 'omit' from source: magic vars 30564 1726882843.87870: variable 'omit' from source: magic vars 30564 1726882843.87907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882843.87932: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882843.87948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882843.87962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882843.87978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882843.87999: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882843.88003: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.88006: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882843.88070: Set connection var ansible_timeout to 10 30564 1726882843.88080: Set connection var ansible_pipelining to False 30564 1726882843.88083: Set connection var ansible_shell_type to sh 30564 1726882843.88087: Set connection var ansible_shell_executable to /bin/sh 30564 1726882843.88093: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882843.88095: Set connection var ansible_connection to ssh 30564 1726882843.88117: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.88120: variable 'ansible_connection' from source: unknown 30564 1726882843.88123: variable 'ansible_module_compression' from source: unknown 30564 1726882843.88125: variable 'ansible_shell_type' from source: unknown 30564 1726882843.88128: variable 'ansible_shell_executable' from source: unknown 30564 1726882843.88130: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882843.88132: variable 'ansible_pipelining' from source: unknown 30564 1726882843.88134: variable 'ansible_timeout' from source: unknown 30564 1726882843.88139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882843.88285: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882843.88297: variable 'omit' from source: magic vars 30564 1726882843.88300: starting attempt loop 30564 1726882843.88302: running the handler 30564 1726882843.88312: _low_level_execute_command(): starting 30564 1726882843.88319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882843.88847: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882843.88868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.88886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882843.88899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882843.88910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.88949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.88962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.88981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.89097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.90764: stdout chunk (state=3): >>>/root <<< 30564 1726882843.90871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.90920: stderr chunk (state=3): >>><<< 30564 1726882843.90923: stdout chunk (state=3): >>><<< 30564 1726882843.90943: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882843.90954: _low_level_execute_command(): starting 30564 1726882843.90958: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778 `" && echo ansible-tmp-1726882843.9094222-32480-87922947294778="` echo /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778 `" ) && sleep 0' 30564 1726882843.91394: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.91400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.91432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.91452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.91509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.91515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.91625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.93537: stdout chunk (state=3): >>>ansible-tmp-1726882843.9094222-32480-87922947294778=/root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778 <<< 30564 1726882843.93707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.93711: stderr chunk (state=3): >>><<< 30564 1726882843.93713: stdout chunk (state=3): >>><<< 30564 1726882843.93840: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882843.9094222-32480-87922947294778=/root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882843.93843: variable 'ansible_module_compression' from source: unknown 30564 1726882843.93845: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882843.93847: variable 'ansible_facts' from source: unknown 30564 1726882843.93902: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778/AnsiballZ_ping.py 30564 1726882843.93997: Sending initial data 30564 1726882843.94001: Sent initial data (152 bytes) 30564 1726882843.94623: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.94630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.94665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.94674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882843.94681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882843.94691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.94697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.94747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.94770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.94773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882843.94881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882843.96639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882843.96645: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882843.96734: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882843.96831: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmppr1zezgv /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778/AnsiballZ_ping.py <<< 30564 1726882843.96925: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 
1726882843.97919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882843.98009: stderr chunk (state=3): >>><<< 30564 1726882843.98013: stdout chunk (state=3): >>><<< 30564 1726882843.98026: done transferring module to remote 30564 1726882843.98033: _low_level_execute_command(): starting 30564 1726882843.98038: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778/ /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778/AnsiballZ_ping.py && sleep 0' 30564 1726882843.98437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882843.98443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882843.98474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.98485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882843.98535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882843.98547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882843.98556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 30564 1726882843.98662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.00437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882844.00483: stderr chunk (state=3): >>><<< 30564 1726882844.00487: stdout chunk (state=3): >>><<< 30564 1726882844.00497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882844.00500: _low_level_execute_command(): starting 30564 1726882844.00505: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778/AnsiballZ_ping.py && sleep 0' 30564 1726882844.00897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882844.00915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.00931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.00942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.00992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.01003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882844.01124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.13997: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882844.15024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882844.15043: stderr chunk (state=3): >>><<< 30564 1726882844.15046: stdout chunk (state=3): >>><<< 30564 1726882844.15154: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
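The remote working directory used in this exchange was created earlier with a `umask 77 && mkdir` one-liner, which guarantees the temporary directory is readable only by the connecting user. A minimal local sketch of that pattern (the path and directory name here are illustrative; Ansible itself uses `~/.ansible/tmp` and a timestamped name):

```shell
# Create a private scratch dir the way the log's one-liner does:
# umask 77 ensures mode 700 on everything created inside the subshell.
tmpbase="${TMPDIR:-/tmp}/ansible-demo-$$"
( umask 77 && mkdir -p "$tmpbase" && mkdir "$tmpbase/ansible-tmp-1" ) \
  && echo "ansible-tmp-1=$tmpbase/ansible-tmp-1"
```

Running the `mkdir` inside a subshell keeps the `umask` change from leaking into the caller's shell, which is why the real command is wrapped in `( ... )` as well.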
30564 1726882844.15159: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882844.15162: _low_level_execute_command(): starting 30564 1726882844.15166: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882843.9094222-32480-87922947294778/ > /dev/null 2>&1 && sleep 0' 30564 1726882844.15677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882844.15691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.15704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.15719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.15758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.15773: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882844.15788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.15807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882844.15819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 
1726882844.15830: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882844.15841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.15854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.15871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.15884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.15895: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882844.15908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.15989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.16006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882844.16021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882844.16151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.18127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882844.18204: stderr chunk (state=3): >>><<< 30564 1726882844.18214: stdout chunk (state=3): >>><<< 30564 1726882844.18602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882844.18605: handler run complete 30564 1726882844.18607: attempt loop complete, returning result 30564 1726882844.18609: _execute() done 30564 1726882844.18610: dumping result to json 30564 1726882844.18612: done dumping result, returning 30564 1726882844.18614: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-000000000d2b] 30564 1726882844.18617: sending task result for task 0e448fcc-3ce9-4216-acec-000000000d2b 30564 1726882844.18696: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000d2b 30564 1726882844.18699: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882844.18756: no more pending results, returning what we have 30564 1726882844.18759: results queue empty 30564 1726882844.18760: checking for any_errors_fatal 30564 1726882844.18769: done checking for any_errors_fatal 30564 1726882844.18770: checking for max_fail_percentage 30564 1726882844.18772: done checking for max_fail_percentage 30564 1726882844.18773: checking to see if all hosts have failed and the running result is not ok 30564 1726882844.18774: done checking to see if all hosts have failed 30564 1726882844.18774: getting the remaining hosts for this loop 30564 1726882844.18776: done getting the remaining hosts for this loop 30564 
1726882844.18780: getting the next task for host managed_node2 30564 1726882844.18791: done getting next task for host managed_node2 30564 1726882844.18793: ^ task is: TASK: meta (role_complete) 30564 1726882844.18798: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882844.18810: getting variables 30564 1726882844.18812: in VariableManager get_vars() 30564 1726882844.18848: Calling all_inventory to load vars for managed_node2 30564 1726882844.18850: Calling groups_inventory to load vars for managed_node2 30564 1726882844.18853: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.18862: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.18867: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.18873: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.20429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.22217: done with get_vars() 30564 1726882844.22238: done getting variables 30564 1726882844.22320: done queuing things up, now waiting for results queue to drain 30564 1726882844.22322: results queue empty 30564 1726882844.22323: checking for any_errors_fatal 30564 1726882844.22325: done checking for any_errors_fatal 30564 1726882844.22326: checking for max_fail_percentage 30564 1726882844.22327: done checking for max_fail_percentage 30564 1726882844.22327: checking to see if all hosts have failed and the running result is not ok 30564 1726882844.22328: done checking to see if all hosts have failed 30564 1726882844.22329: getting the remaining hosts for this loop 30564 1726882844.22330: done getting the remaining hosts for this loop 30564 1726882844.22332: getting the next task for host managed_node2 30564 1726882844.22337: done getting next task for host managed_node2 30564 1726882844.22339: ^ task is: TASK: Asserts 30564 1726882844.22341: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882844.22344: getting variables 30564 1726882844.22345: in VariableManager get_vars() 30564 1726882844.22354: Calling all_inventory to load vars for managed_node2 30564 1726882844.22356: Calling groups_inventory to load vars for managed_node2 30564 1726882844.22358: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.22364: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.22367: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.22372: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.23699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.25451: done with get_vars() 30564 1726882844.25474: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:40:44 -0400 (0:00:00.386) 0:00:42.836 ****** 30564 1726882844.25544: entering _queue_task() for managed_node2/include_tasks 30564 1726882844.25839: worker is 1 (out of 1 available) 30564 1726882844.25852: exiting _queue_task() for managed_node2/include_tasks 30564 1726882844.25865: done queuing things up, now waiting for results queue to drain 30564 1726882844.25866: waiting for pending results... 
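The `Asserts` task below is an `include_tasks` that loops over `lsr_assert`, producing one include per item — which is why the log repeats the `variable 'item'` lookups and then processes `assert_device_present.yml` and `assert_profile_present.yml` in turn. A sketch of that expansion (file names taken from the log; the dict shape is illustrative, not Ansible's internal representation):

```python
# How a looped include_tasks fans out into one include per lsr_assert item.
lsr_assert = [
    "tasks/assert_device_present.yml",
    "tasks/assert_profile_present.yml",
]

includes = [{"include_tasks": item, "host": "managed_node2"}
            for item in lsr_assert]

for inc in includes:
    # Each entry becomes a separate "processing included file" pass in the log.
    print(inc["include_tasks"])
```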
30564 1726882844.26160: running TaskExecutor() for managed_node2/TASK: Asserts 30564 1726882844.26300: in run() - task 0e448fcc-3ce9-4216-acec-000000000a4e 30564 1726882844.26324: variable 'ansible_search_path' from source: unknown 30564 1726882844.26332: variable 'ansible_search_path' from source: unknown 30564 1726882844.26388: variable 'lsr_assert' from source: include params 30564 1726882844.26625: variable 'lsr_assert' from source: include params 30564 1726882844.26707: variable 'omit' from source: magic vars 30564 1726882844.26870: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.26888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.26906: variable 'omit' from source: magic vars 30564 1726882844.27160: variable 'ansible_distribution_major_version' from source: facts 30564 1726882844.27184: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882844.27195: variable 'item' from source: unknown 30564 1726882844.27262: variable 'item' from source: unknown 30564 1726882844.27308: variable 'item' from source: unknown 30564 1726882844.27374: variable 'item' from source: unknown 30564 1726882844.27576: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.27592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.27606: variable 'omit' from source: magic vars 30564 1726882844.27773: variable 'ansible_distribution_major_version' from source: facts 30564 1726882844.27785: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882844.27794: variable 'item' from source: unknown 30564 1726882844.27859: variable 'item' from source: unknown 30564 1726882844.27898: variable 'item' from source: unknown 30564 1726882844.27962: variable 'item' from source: unknown 30564 1726882844.28046: dumping result to json 30564 1726882844.28057: done dumping result, returning 30564 
1726882844.28074: done running TaskExecutor() for managed_node2/TASK: Asserts [0e448fcc-3ce9-4216-acec-000000000a4e] 30564 1726882844.28085: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4e 30564 1726882844.28151: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4e 30564 1726882844.28159: WORKER PROCESS EXITING 30564 1726882844.28192: no more pending results, returning what we have 30564 1726882844.28197: in VariableManager get_vars() 30564 1726882844.28233: Calling all_inventory to load vars for managed_node2 30564 1726882844.28236: Calling groups_inventory to load vars for managed_node2 30564 1726882844.28240: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.28255: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.28259: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.28262: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.29959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.31684: done with get_vars() 30564 1726882844.31703: variable 'ansible_search_path' from source: unknown 30564 1726882844.31704: variable 'ansible_search_path' from source: unknown 30564 1726882844.31743: variable 'ansible_search_path' from source: unknown 30564 1726882844.31744: variable 'ansible_search_path' from source: unknown 30564 1726882844.31775: we have included files to process 30564 1726882844.31776: generating all_blocks data 30564 1726882844.31778: done generating all_blocks data 30564 1726882844.31784: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882844.31785: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882844.31787: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882844.31894: in VariableManager get_vars() 30564 1726882844.31912: done with get_vars() 30564 1726882844.32021: done processing included file 30564 1726882844.32023: iterating over new_blocks loaded from include file 30564 1726882844.32025: in VariableManager get_vars() 30564 1726882844.32039: done with get_vars() 30564 1726882844.32040: filtering new block on tags 30564 1726882844.32079: done filtering new block on tags 30564 1726882844.32082: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item=tasks/assert_device_present.yml) 30564 1726882844.32087: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30564 1726882844.32088: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30564 1726882844.32090: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 30564 1726882844.32198: in VariableManager get_vars() 30564 1726882844.32216: done with get_vars() 30564 1726882844.32451: done processing included file 30564 1726882844.32452: iterating over new_blocks loaded from include file 30564 1726882844.32454: in VariableManager get_vars() 30564 1726882844.32471: done with get_vars() 30564 1726882844.32473: filtering new block on tags 30564 1726882844.32521: done filtering new block on tags 30564 1726882844.32524: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for 
managed_node2 => (item=tasks/assert_profile_present.yml) 30564 1726882844.32527: extending task lists for all hosts with included blocks 30564 1726882844.33487: done extending task lists 30564 1726882844.33488: done processing included files 30564 1726882844.33489: results queue empty 30564 1726882844.33490: checking for any_errors_fatal 30564 1726882844.33491: done checking for any_errors_fatal 30564 1726882844.33492: checking for max_fail_percentage 30564 1726882844.33493: done checking for max_fail_percentage 30564 1726882844.33494: checking to see if all hosts have failed and the running result is not ok 30564 1726882844.33495: done checking to see if all hosts have failed 30564 1726882844.33496: getting the remaining hosts for this loop 30564 1726882844.33497: done getting the remaining hosts for this loop 30564 1726882844.33499: getting the next task for host managed_node2 30564 1726882844.33504: done getting next task for host managed_node2 30564 1726882844.33506: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882844.33509: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882844.33516: getting variables 30564 1726882844.33517: in VariableManager get_vars() 30564 1726882844.33526: Calling all_inventory to load vars for managed_node2 30564 1726882844.33528: Calling groups_inventory to load vars for managed_node2 30564 1726882844.33530: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.33536: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.33538: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.33541: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.34797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.36513: done with get_vars() 30564 1726882844.36533: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:40:44 -0400 (0:00:00.110) 0:00:42.947 ****** 30564 1726882844.36604: entering _queue_task() for managed_node2/include_tasks 30564 1726882844.36895: worker is 1 (out of 1 available) 30564 1726882844.36906: exiting _queue_task() for managed_node2/include_tasks 30564 1726882844.36919: done queuing things up, now waiting for results queue to drain 30564 1726882844.36920: waiting for pending results... 
30564 1726882844.37207: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882844.37322: in run() - task 0e448fcc-3ce9-4216-acec-000000000e86 30564 1726882844.37342: variable 'ansible_search_path' from source: unknown 30564 1726882844.37349: variable 'ansible_search_path' from source: unknown 30564 1726882844.37396: calling self._execute() 30564 1726882844.37503: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.37514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.37529: variable 'omit' from source: magic vars 30564 1726882844.37927: variable 'ansible_distribution_major_version' from source: facts 30564 1726882844.37945: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882844.37955: _execute() done 30564 1726882844.37963: dumping result to json 30564 1726882844.37976: done dumping result, returning 30564 1726882844.37986: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-000000000e86] 30564 1726882844.37997: sending task result for task 0e448fcc-3ce9-4216-acec-000000000e86 30564 1726882844.38108: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000e86 30564 1726882844.38118: WORKER PROCESS EXITING 30564 1726882844.38146: no more pending results, returning what we have 30564 1726882844.38151: in VariableManager get_vars() 30564 1726882844.38194: Calling all_inventory to load vars for managed_node2 30564 1726882844.38198: Calling groups_inventory to load vars for managed_node2 30564 1726882844.38201: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.38214: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.38218: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.38221: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882844.40026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.41812: done with get_vars() 30564 1726882844.41831: variable 'ansible_search_path' from source: unknown 30564 1726882844.41833: variable 'ansible_search_path' from source: unknown 30564 1726882844.41840: variable 'item' from source: include params 30564 1726882844.41954: variable 'item' from source: include params 30564 1726882844.41993: we have included files to process 30564 1726882844.41994: generating all_blocks data 30564 1726882844.41996: done generating all_blocks data 30564 1726882844.41997: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882844.41999: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882844.42001: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882844.42177: done processing included file 30564 1726882844.42179: iterating over new_blocks loaded from include file 30564 1726882844.42180: in VariableManager get_vars() 30564 1726882844.42193: done with get_vars() 30564 1726882844.42195: filtering new block on tags 30564 1726882844.42216: done filtering new block on tags 30564 1726882844.42218: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882844.42222: extending task lists for all hosts with included blocks 30564 1726882844.42370: done extending task lists 30564 1726882844.42372: done processing included files 30564 1726882844.42373: results queue empty 30564 1726882844.42373: checking for any_errors_fatal 30564 1726882844.42378: done 
checking for any_errors_fatal 30564 1726882844.42379: checking for max_fail_percentage 30564 1726882844.42380: done checking for max_fail_percentage 30564 1726882844.42381: checking to see if all hosts have failed and the running result is not ok 30564 1726882844.42382: done checking to see if all hosts have failed 30564 1726882844.42382: getting the remaining hosts for this loop 30564 1726882844.42384: done getting the remaining hosts for this loop 30564 1726882844.42387: getting the next task for host managed_node2 30564 1726882844.42392: done getting next task for host managed_node2 30564 1726882844.42394: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882844.42398: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882844.42401: getting variables 30564 1726882844.42402: in VariableManager get_vars() 30564 1726882844.42411: Calling all_inventory to load vars for managed_node2 30564 1726882844.42414: Calling groups_inventory to load vars for managed_node2 30564 1726882844.42416: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.42421: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.42423: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.42426: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.43661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.45336: done with get_vars() 30564 1726882844.45356: done getting variables 30564 1726882844.45480: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:40:44 -0400 (0:00:00.089) 0:00:43.036 ****** 30564 1726882844.45508: entering _queue_task() for managed_node2/stat 30564 1726882844.45798: worker is 1 (out of 1 available) 30564 1726882844.45810: exiting _queue_task() for managed_node2/stat 30564 1726882844.45823: done queuing things up, now waiting for results queue to drain 30564 1726882844.45825: waiting for pending results... 
30564 1726882844.46110: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882844.46242: in run() - task 0e448fcc-3ce9-4216-acec-000000000ef5 30564 1726882844.46262: variable 'ansible_search_path' from source: unknown 30564 1726882844.46279: variable 'ansible_search_path' from source: unknown 30564 1726882844.46316: calling self._execute() 30564 1726882844.46420: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.46432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.46448: variable 'omit' from source: magic vars 30564 1726882844.46813: variable 'ansible_distribution_major_version' from source: facts 30564 1726882844.46831: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882844.46842: variable 'omit' from source: magic vars 30564 1726882844.46901: variable 'omit' from source: magic vars 30564 1726882844.47003: variable 'interface' from source: play vars 30564 1726882844.47028: variable 'omit' from source: magic vars 30564 1726882844.47076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882844.47114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882844.47143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882844.47166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882844.47185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882844.47217: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882844.47225: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.47232: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.47337: Set connection var ansible_timeout to 10 30564 1726882844.47349: Set connection var ansible_pipelining to False 30564 1726882844.47359: Set connection var ansible_shell_type to sh 30564 1726882844.47374: Set connection var ansible_shell_executable to /bin/sh 30564 1726882844.47386: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882844.47391: Set connection var ansible_connection to ssh 30564 1726882844.47417: variable 'ansible_shell_executable' from source: unknown 30564 1726882844.47425: variable 'ansible_connection' from source: unknown 30564 1726882844.47431: variable 'ansible_module_compression' from source: unknown 30564 1726882844.47437: variable 'ansible_shell_type' from source: unknown 30564 1726882844.47443: variable 'ansible_shell_executable' from source: unknown 30564 1726882844.47449: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.47456: variable 'ansible_pipelining' from source: unknown 30564 1726882844.47466: variable 'ansible_timeout' from source: unknown 30564 1726882844.47479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.47682: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882844.47699: variable 'omit' from source: magic vars 30564 1726882844.47708: starting attempt loop 30564 1726882844.47714: running the handler 30564 1726882844.47732: _low_level_execute_command(): starting 30564 1726882844.47742: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882844.48515: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882844.48532: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882844.48552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.48581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.48627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.48640: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882844.48657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.48684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882844.48697: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882844.48709: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882844.48721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.48736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.48752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.48771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.48787: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882844.48803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.48884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.48911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882844.48931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882844.49074: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882844.50748: stdout chunk (state=3): >>>/root <<< 30564 1726882844.50848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882844.50913: stderr chunk (state=3): >>><<< 30564 1726882844.50916: stdout chunk (state=3): >>><<< 30564 1726882844.50940: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882844.50954: _low_level_execute_command(): starting 30564 1726882844.50961: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115 `" && echo ansible-tmp-1726882844.5093896-32501-13173436326115="` echo /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115 `" ) && sleep 0' 30564 
1726882844.51560: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882844.51573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.51582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.51599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.51634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.51637: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882844.51648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.51662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882844.51673: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882844.51680: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882844.51688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.51699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.51708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.51716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.51722: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882844.51734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.51799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.51809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882844.51823: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882844.51976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.53858: stdout chunk (state=3): >>>ansible-tmp-1726882844.5093896-32501-13173436326115=/root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115 <<< 30564 1726882844.53973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882844.54041: stderr chunk (state=3): >>><<< 30564 1726882844.54045: stdout chunk (state=3): >>><<< 30564 1726882844.54064: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882844.5093896-32501-13173436326115=/root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882844.54107: variable 'ansible_module_compression' from source: unknown 
30564 1726882844.54171: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882844.54206: variable 'ansible_facts' from source: unknown 30564 1726882844.54293: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115/AnsiballZ_stat.py 30564 1726882844.54422: Sending initial data 30564 1726882844.54426: Sent initial data (152 bytes) 30564 1726882844.55332: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882844.55337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.55348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.55362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.55401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.55406: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882844.55416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.55429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882844.55437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882844.55444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882844.55451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.55461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.55476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.55486: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.55491: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882844.55502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.55579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.55586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882844.55599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882844.55745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.57548: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882844.57645: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882844.57742: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpjpf2u_iu /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115/AnsiballZ_stat.py <<< 30564 1726882844.57838: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882844.59572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882844.59576: stderr chunk (state=3): >>><<< 30564 1726882844.59578: stdout chunk (state=3): 
>>><<< 30564 1726882844.59580: done transferring module to remote 30564 1726882844.59587: _low_level_execute_command(): starting 30564 1726882844.59590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115/ /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115/AnsiballZ_stat.py && sleep 0' 30564 1726882844.60154: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882844.60162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.60175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.60187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.60223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.60230: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882844.60239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.60261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882844.60273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882844.60280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882844.60288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.60296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.60308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.60316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.60322: 
stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882844.60332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.60412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.60428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882844.60439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882844.60562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.62435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882844.62439: stdout chunk (state=3): >>><<< 30564 1726882844.62445: stderr chunk (state=3): >>><<< 30564 1726882844.62463: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 30564 1726882844.62470: _low_level_execute_command(): starting 30564 1726882844.62473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115/AnsiballZ_stat.py && sleep 0' 30564 1726882844.63138: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882844.63141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.63143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.63146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.63386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882844.63393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.63397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.63413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882844.63418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.63506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.63512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882844.63521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882844.63649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.76876: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32163, "dev": 21, "nlink": 1, "atime": 1726882837.2303085, "mtime": 1726882837.2303085, "ctime": 1726882837.2303085, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882844.77875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
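The stat result above reports `/sys/class/net/statebr` as a symlink (`"islnk": true`) pointing into `/sys/devices/virtual/net/`, which is what the later assert checks. A minimal Python sketch of the subset of fields visible in that result — an illustration only, not the actual `AnsiballZ_stat.py` module, which collects far more (uid, mode, timestamps, checksums):

```python
import os
import tempfile

def interface_stat(path):
    """Collect a few of the fields seen in the stat result above.

    Illustrative sketch only; the real Ansible stat module gathers many
    more fields inside the transferred AnsiballZ_stat.py payload.
    """
    if not os.path.lexists(path):
        return {"exists": False}
    st = os.lstat(path)  # lstat: examine the link itself, don't follow it
    info = {
        "exists": True,
        "path": path,
        "islnk": os.path.islink(path),
        "isdir": os.path.isdir(path),
        "nlink": st.st_nlink,
    }
    if info["islnk"]:
        info["lnk_target"] = os.readlink(path)       # raw link text
        info["lnk_source"] = os.path.realpath(path)  # fully resolved path
    return info

# Demonstrate on a throwaway symlink, since /sys/class/net/statebr only
# exists on the managed node in the log:
_tmp = tempfile.mkdtemp()
_target = os.path.join(_tmp, "device")
os.mkdir(_target)
_link = os.path.join(_tmp, "statebr")
os.symlink(_target, _link)
demo = interface_stat(_link)
```

On the managed node the equivalent call would be `interface_stat("/sys/class/net/statebr")`, yielding the `exists`/`islnk`/`lnk_target` values shown in the log.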
<<< 30564 1726882844.77921: stderr chunk (state=3): >>><<< 30564 1726882844.77924: stdout chunk (state=3): >>><<< 30564 1726882844.77938: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32163, "dev": 21, "nlink": 1, "atime": 1726882837.2303085, "mtime": 1726882837.2303085, "ctime": 1726882837.2303085, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882844.77983: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882844.77992: _low_level_execute_command(): starting 30564 1726882844.77996: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882844.5093896-32501-13173436326115/ > /dev/null 2>&1 && sleep 0' 30564 1726882844.78514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882844.78528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.78543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.78561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.78604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882844.78619: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882844.78636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.78653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882844.78670: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882844.78684: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882844.78696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882844.78709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882844.78724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882844.78739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882844.78751: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882844.78767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882844.78840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882844.78861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882844.78879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882844.79009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882844.80842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882844.80895: stderr chunk (state=3): >>><<< 30564 1726882844.80899: stdout chunk (state=3): >>><<< 30564 1726882844.80914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882844.80920: handler run complete 30564 1726882844.80974: attempt loop complete, returning result 30564 1726882844.80978: _execute() done 30564 1726882844.80980: dumping result to json 30564 1726882844.80982: done dumping result, returning 30564 1726882844.80990: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-000000000ef5] 30564 1726882844.80997: sending task result for task 0e448fcc-3ce9-4216-acec-000000000ef5 30564 1726882844.81120: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000ef5 30564 1726882844.81123: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882837.2303085, "block_size": 4096, "blocks": 0, "ctime": 1726882837.2303085, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32163, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, 
"isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726882837.2303085, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30564 1726882844.81218: no more pending results, returning what we have 30564 1726882844.81221: results queue empty 30564 1726882844.81222: checking for any_errors_fatal 30564 1726882844.81224: done checking for any_errors_fatal 30564 1726882844.81225: checking for max_fail_percentage 30564 1726882844.81226: done checking for max_fail_percentage 30564 1726882844.81227: checking to see if all hosts have failed and the running result is not ok 30564 1726882844.81228: done checking to see if all hosts have failed 30564 1726882844.81229: getting the remaining hosts for this loop 30564 1726882844.81230: done getting the remaining hosts for this loop 30564 1726882844.81236: getting the next task for host managed_node2 30564 1726882844.81245: done getting next task for host managed_node2 30564 1726882844.81247: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30564 1726882844.81251: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882844.81259: getting variables 30564 1726882844.81260: in VariableManager get_vars() 30564 1726882844.81300: Calling all_inventory to load vars for managed_node2 30564 1726882844.81303: Calling groups_inventory to load vars for managed_node2 30564 1726882844.81306: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.81315: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.81318: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.81320: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.83015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.84949: done with get_vars() 30564 1726882844.84989: done getting variables 30564 1726882844.85047: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882844.85195: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:40:44 -0400 (0:00:00.397) 0:00:43.433 ****** 30564 1726882844.85226: entering _queue_task() for managed_node2/assert 30564 1726882844.85577: worker is 1 (out of 1 available) 30564 1726882844.85592: exiting _queue_task() for managed_node2/assert 30564 1726882844.85605: done queuing things 
up, now waiting for results queue to drain 30564 1726882844.85607: waiting for pending results... 30564 1726882844.85930: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 30564 1726882844.86076: in run() - task 0e448fcc-3ce9-4216-acec-000000000e87 30564 1726882844.86095: variable 'ansible_search_path' from source: unknown 30564 1726882844.86102: variable 'ansible_search_path' from source: unknown 30564 1726882844.86141: calling self._execute() 30564 1726882844.86254: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.86278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.86300: variable 'omit' from source: magic vars 30564 1726882844.86704: variable 'ansible_distribution_major_version' from source: facts 30564 1726882844.86733: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882844.86744: variable 'omit' from source: magic vars 30564 1726882844.86798: variable 'omit' from source: magic vars 30564 1726882844.86906: variable 'interface' from source: play vars 30564 1726882844.86936: variable 'omit' from source: magic vars 30564 1726882844.86992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882844.87042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882844.87079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882844.87103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882844.87120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882844.87171: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882844.87182: 
variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.87191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.87312: Set connection var ansible_timeout to 10 30564 1726882844.87324: Set connection var ansible_pipelining to False 30564 1726882844.87332: Set connection var ansible_shell_type to sh 30564 1726882844.87343: Set connection var ansible_shell_executable to /bin/sh 30564 1726882844.87355: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882844.87378: Set connection var ansible_connection to ssh 30564 1726882844.87409: variable 'ansible_shell_executable' from source: unknown 30564 1726882844.87418: variable 'ansible_connection' from source: unknown 30564 1726882844.87427: variable 'ansible_module_compression' from source: unknown 30564 1726882844.87434: variable 'ansible_shell_type' from source: unknown 30564 1726882844.87441: variable 'ansible_shell_executable' from source: unknown 30564 1726882844.87448: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.87455: variable 'ansible_pipelining' from source: unknown 30564 1726882844.87461: variable 'ansible_timeout' from source: unknown 30564 1726882844.87481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.87636: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882844.87651: variable 'omit' from source: magic vars 30564 1726882844.87660: starting attempt loop 30564 1726882844.87672: running the handler 30564 1726882844.87832: variable 'interface_stat' from source: set_fact 30564 1726882844.87859: Evaluated conditional (interface_stat.stat.exists): True 30564 1726882844.87874: 
handler run complete 30564 1726882844.87894: attempt loop complete, returning result 30564 1726882844.87903: _execute() done 30564 1726882844.87916: dumping result to json 30564 1726882844.87929: done dumping result, returning 30564 1726882844.87941: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [0e448fcc-3ce9-4216-acec-000000000e87] 30564 1726882844.87951: sending task result for task 0e448fcc-3ce9-4216-acec-000000000e87 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882844.88104: no more pending results, returning what we have 30564 1726882844.88108: results queue empty 30564 1726882844.88109: checking for any_errors_fatal 30564 1726882844.88118: done checking for any_errors_fatal 30564 1726882844.88119: checking for max_fail_percentage 30564 1726882844.88121: done checking for max_fail_percentage 30564 1726882844.88122: checking to see if all hosts have failed and the running result is not ok 30564 1726882844.88123: done checking to see if all hosts have failed 30564 1726882844.88124: getting the remaining hosts for this loop 30564 1726882844.88126: done getting the remaining hosts for this loop 30564 1726882844.88130: getting the next task for host managed_node2 30564 1726882844.88141: done getting next task for host managed_node2 30564 1726882844.88144: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30564 1726882844.88150: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882844.88155: getting variables 30564 1726882844.88157: in VariableManager get_vars() 30564 1726882844.88203: Calling all_inventory to load vars for managed_node2 30564 1726882844.88206: Calling groups_inventory to load vars for managed_node2 30564 1726882844.88210: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.88221: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.88225: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.88227: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.89303: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000e87 30564 1726882844.89306: WORKER PROCESS EXITING 30564 1726882844.90232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.91978: done with get_vars() 30564 1726882844.91996: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:40:44 -0400 (0:00:00.068) 0:00:43.502 ****** 30564 1726882844.92077: entering _queue_task() for managed_node2/include_tasks 30564 1726882844.92293: worker is 1 (out of 1 available) 30564 1726882844.92308: exiting _queue_task() for managed_node2/include_tasks 30564 1726882844.92319: done queuing things up, now waiting for results queue to drain 30564 1726882844.92321: waiting for pending results... 
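The assert task above evaluates the conditional `interface_stat.stat.exists` against the fact registered by the preceding stat task and reports "All assertions passed". A rough sketch of that contract — every entry in the plugin's `that` list must evaluate truthy or the task fails. The real action plugin templates each condition through Jinja2 on the controller; plain callables stand in for that here (an illustration, not Ansible's code):

```python
def run_assert(conditions, facts):
    """Sketch of the assert action plugin's contract: all conditions
    must evaluate truthy against the current facts, or the task fails.
    Jinja2 templating is replaced by plain callables for illustration.
    """
    for cond in conditions:
        if not cond(facts):
            return {"failed": True, "msg": "Assertion failed"}
    return {"changed": False, "msg": "All assertions passed"}

# The condition from the log, 'interface_stat.stat.exists':
facts = {"interface_stat": {"stat": {"exists": True}}}
result = run_assert([lambda f: f["interface_stat"]["stat"]["exists"]], facts)
```

Note that, like `set_fact`, `assert` never transfers a module to the remote host, which is why no sftp/chmod/execute sequence appears in the log for this task.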
30564 1726882844.92506: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30564 1726882844.92582: in run() - task 0e448fcc-3ce9-4216-acec-000000000e8b 30564 1726882844.92593: variable 'ansible_search_path' from source: unknown 30564 1726882844.92596: variable 'ansible_search_path' from source: unknown 30564 1726882844.92623: calling self._execute() 30564 1726882844.92702: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882844.92706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882844.92716: variable 'omit' from source: magic vars 30564 1726882844.92984: variable 'ansible_distribution_major_version' from source: facts 30564 1726882844.92995: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882844.93000: _execute() done 30564 1726882844.93003: dumping result to json 30564 1726882844.93005: done dumping result, returning 30564 1726882844.93011: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4216-acec-000000000e8b] 30564 1726882844.93017: sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8b 30564 1726882844.93106: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8b 30564 1726882844.93109: WORKER PROCESS EXITING 30564 1726882844.93139: no more pending results, returning what we have 30564 1726882844.93144: in VariableManager get_vars() 30564 1726882844.93184: Calling all_inventory to load vars for managed_node2 30564 1726882844.93187: Calling groups_inventory to load vars for managed_node2 30564 1726882844.93190: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.93200: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.93203: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.93206: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882844.95143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882844.97495: done with get_vars() 30564 1726882844.97514: variable 'ansible_search_path' from source: unknown 30564 1726882844.97516: variable 'ansible_search_path' from source: unknown 30564 1726882844.97523: variable 'item' from source: include params 30564 1726882844.97635: variable 'item' from source: include params 30564 1726882844.97666: we have included files to process 30564 1726882844.97667: generating all_blocks data 30564 1726882844.97669: done generating all_blocks data 30564 1726882844.97674: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882844.97675: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882844.97676: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882844.98271: done processing included file 30564 1726882844.98273: iterating over new_blocks loaded from include file 30564 1726882844.98274: in VariableManager get_vars() 30564 1726882844.98285: done with get_vars() 30564 1726882844.98286: filtering new block on tags 30564 1726882844.98331: done filtering new block on tags 30564 1726882844.98333: in VariableManager get_vars() 30564 1726882844.98341: done with get_vars() 30564 1726882844.98342: filtering new block on tags 30564 1726882844.98378: done filtering new block on tags 30564 1726882844.98379: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30564 1726882844.98383: extending task lists for all hosts with included blocks 30564 1726882844.98655: done 
extending task lists 30564 1726882844.98656: done processing included files 30564 1726882844.98657: results queue empty 30564 1726882844.98657: checking for any_errors_fatal 30564 1726882844.98660: done checking for any_errors_fatal 30564 1726882844.98661: checking for max_fail_percentage 30564 1726882844.98661: done checking for max_fail_percentage 30564 1726882844.98662: checking to see if all hosts have failed and the running result is not ok 30564 1726882844.98662: done checking to see if all hosts have failed 30564 1726882844.98663: getting the remaining hosts for this loop 30564 1726882844.98665: done getting the remaining hosts for this loop 30564 1726882844.98667: getting the next task for host managed_node2 30564 1726882844.98671: done getting next task for host managed_node2 30564 1726882844.98672: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882844.98676: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882844.98678: getting variables 30564 1726882844.98678: in VariableManager get_vars() 30564 1726882844.98685: Calling all_inventory to load vars for managed_node2 30564 1726882844.98686: Calling groups_inventory to load vars for managed_node2 30564 1726882844.98687: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882844.98691: Calling all_plugins_play to load vars for managed_node2 30564 1726882844.98692: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882844.98694: Calling groups_plugins_play to load vars for managed_node2 30564 1726882844.99682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882845.00869: done with get_vars() 30564 1726882845.00883: done getting variables 30564 1726882845.00909: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:40:45 -0400 (0:00:00.088) 0:00:43.590 ****** 30564 1726882845.00931: entering _queue_task() for managed_node2/set_fact 30564 1726882845.01137: worker is 1 (out of 1 available) 30564 1726882845.01147: exiting _queue_task() for managed_node2/set_fact 30564 1726882845.01158: done queuing things up, now waiting for results queue to drain 30564 1726882845.01160: waiting for pending results... 
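The queued `set_fact` task ("Initialize NM profile exist and ansible_managed comment flag") also runs entirely on the controller: its returned facts are merged into the host's variable store, where later tasks in `get_profile_stat.yml` read them. A sketch of that merge, under the assumption that the task initializes a couple of boolean flags — the flag names below are hypothetical, since the included file's contents are not shown in this log:

```python
def apply_set_fact(hostvars, new_facts):
    """Sketch of the set_fact flow: merge the action's returned facts
    into the host's variable store without mutating the caller's copy.
    set_fact results take precedence over play vars for later lookups.
    """
    merged = dict(hostvars)
    merged.update(new_facts)
    return merged

hostvars = {"interface": "statebr"}
hostvars = apply_set_fact(
    hostvars,
    {
        "profile_exists_flag": False,     # hypothetical flag name
        "ansible_managed_flag": False,    # hypothetical flag name
    },
)
```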
30564 1726882845.01343: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882845.01416: in run() - task 0e448fcc-3ce9-4216-acec-000000000f13 30564 1726882845.01427: variable 'ansible_search_path' from source: unknown 30564 1726882845.01431: variable 'ansible_search_path' from source: unknown 30564 1726882845.01457: calling self._execute() 30564 1726882845.01530: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.01536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.01546: variable 'omit' from source: magic vars 30564 1726882845.01808: variable 'ansible_distribution_major_version' from source: facts 30564 1726882845.01818: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882845.01825: variable 'omit' from source: magic vars 30564 1726882845.01861: variable 'omit' from source: magic vars 30564 1726882845.01888: variable 'omit' from source: magic vars 30564 1726882845.01923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882845.01949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882845.01967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882845.01982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882845.01992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882845.02020: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882845.02023: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.02026: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882845.02092: Set connection var ansible_timeout to 10 30564 1726882845.02096: Set connection var ansible_pipelining to False 30564 1726882845.02098: Set connection var ansible_shell_type to sh 30564 1726882845.02104: Set connection var ansible_shell_executable to /bin/sh 30564 1726882845.02114: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882845.02118: Set connection var ansible_connection to ssh 30564 1726882845.02136: variable 'ansible_shell_executable' from source: unknown 30564 1726882845.02139: variable 'ansible_connection' from source: unknown 30564 1726882845.02144: variable 'ansible_module_compression' from source: unknown 30564 1726882845.02147: variable 'ansible_shell_type' from source: unknown 30564 1726882845.02149: variable 'ansible_shell_executable' from source: unknown 30564 1726882845.02152: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.02154: variable 'ansible_pipelining' from source: unknown 30564 1726882845.02157: variable 'ansible_timeout' from source: unknown 30564 1726882845.02159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.02255: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882845.02265: variable 'omit' from source: magic vars 30564 1726882845.02272: starting attempt loop 30564 1726882845.02276: running the handler 30564 1726882845.02287: handler run complete 30564 1726882845.02295: attempt loop complete, returning result 30564 1726882845.02297: _execute() done 30564 1726882845.02299: dumping result to json 30564 1726882845.02302: done dumping result, returning 30564 1726882845.02308: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4216-acec-000000000f13] 30564 1726882845.02313: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f13 30564 1726882845.02393: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f13 30564 1726882845.02396: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30564 1726882845.02454: no more pending results, returning what we have 30564 1726882845.02458: results queue empty 30564 1726882845.02459: checking for any_errors_fatal 30564 1726882845.02461: done checking for any_errors_fatal 30564 1726882845.02462: checking for max_fail_percentage 30564 1726882845.02465: done checking for max_fail_percentage 30564 1726882845.02466: checking to see if all hosts have failed and the running result is not ok 30564 1726882845.02467: done checking to see if all hosts have failed 30564 1726882845.02470: getting the remaining hosts for this loop 30564 1726882845.02471: done getting the remaining hosts for this loop 30564 1726882845.02475: getting the next task for host managed_node2 30564 1726882845.02482: done getting next task for host managed_node2 30564 1726882845.02484: ^ task is: TASK: Stat profile file 30564 1726882845.02488: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882845.02492: getting variables 30564 1726882845.02493: in VariableManager get_vars() 30564 1726882845.02525: Calling all_inventory to load vars for managed_node2 30564 1726882845.02527: Calling groups_inventory to load vars for managed_node2 30564 1726882845.02530: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882845.02539: Calling all_plugins_play to load vars for managed_node2 30564 1726882845.02542: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882845.02549: Calling groups_plugins_play to load vars for managed_node2 30564 1726882845.03421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882845.04374: done with get_vars() 30564 1726882845.04389: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:40:45 -0400 (0:00:00.035) 0:00:43.625 ****** 30564 1726882845.04446: entering _queue_task() for managed_node2/stat 30564 1726882845.04633: worker is 1 (out of 1 available) 30564 1726882845.04646: exiting _queue_task() for managed_node2/stat 30564 1726882845.04657: done queuing things up, now waiting for results queue to drain 30564 1726882845.04658: 
waiting for pending results... 30564 1726882845.04827: running TaskExecutor() for managed_node2/TASK: Stat profile file 30564 1726882845.04900: in run() - task 0e448fcc-3ce9-4216-acec-000000000f14 30564 1726882845.04911: variable 'ansible_search_path' from source: unknown 30564 1726882845.04915: variable 'ansible_search_path' from source: unknown 30564 1726882845.04940: calling self._execute() 30564 1726882845.05007: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.05013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.05021: variable 'omit' from source: magic vars 30564 1726882845.05274: variable 'ansible_distribution_major_version' from source: facts 30564 1726882845.05283: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882845.05289: variable 'omit' from source: magic vars 30564 1726882845.05329: variable 'omit' from source: magic vars 30564 1726882845.05395: variable 'profile' from source: play vars 30564 1726882845.05399: variable 'interface' from source: play vars 30564 1726882845.05444: variable 'interface' from source: play vars 30564 1726882845.05460: variable 'omit' from source: magic vars 30564 1726882845.05493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882845.05518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882845.05533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882845.05548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882845.05561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882845.05582: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30564 1726882845.05585: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.05588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.05652: Set connection var ansible_timeout to 10 30564 1726882845.05657: Set connection var ansible_pipelining to False 30564 1726882845.05660: Set connection var ansible_shell_type to sh 30564 1726882845.05671: Set connection var ansible_shell_executable to /bin/sh 30564 1726882845.05674: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882845.05678: Set connection var ansible_connection to ssh 30564 1726882845.05695: variable 'ansible_shell_executable' from source: unknown 30564 1726882845.05698: variable 'ansible_connection' from source: unknown 30564 1726882845.05701: variable 'ansible_module_compression' from source: unknown 30564 1726882845.05703: variable 'ansible_shell_type' from source: unknown 30564 1726882845.05705: variable 'ansible_shell_executable' from source: unknown 30564 1726882845.05707: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.05710: variable 'ansible_pipelining' from source: unknown 30564 1726882845.05712: variable 'ansible_timeout' from source: unknown 30564 1726882845.05717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.05855: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882845.05866: variable 'omit' from source: magic vars 30564 1726882845.05873: starting attempt loop 30564 1726882845.05876: running the handler 30564 1726882845.05888: _low_level_execute_command(): starting 30564 1726882845.05891: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 
1726882845.06390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.06399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.06434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.06447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.06453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.06508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.06526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.06642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.08296: stdout chunk (state=3): >>>/root <<< 30564 1726882845.08398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.08443: stderr chunk (state=3): >>><<< 30564 1726882845.08446: stdout chunk (state=3): >>><<< 30564 1726882845.08471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.08485: _low_level_execute_command(): starting 30564 1726882845.08489: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242 `" && echo ansible-tmp-1726882845.084727-32529-175556671418242="` echo /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242 `" ) && sleep 0' 30564 1726882845.08918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.08924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.08951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.08975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.09022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.09034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.09145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.11051: stdout chunk (state=3): >>>ansible-tmp-1726882845.084727-32529-175556671418242=/root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242 <<< 30564 1726882845.11161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.11209: stderr chunk (state=3): >>><<< 30564 1726882845.11212: stdout chunk (state=3): >>><<< 30564 1726882845.11223: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882845.084727-32529-175556671418242=/root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.11260: variable 'ansible_module_compression' from source: unknown 30564 1726882845.11306: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882845.11333: variable 'ansible_facts' from source: unknown 30564 1726882845.11400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242/AnsiballZ_stat.py 30564 1726882845.11500: Sending initial data 30564 1726882845.11509: Sent initial data (152 bytes) 30564 1726882845.12172: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.12187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.12201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.12219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.12257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.12278: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.12293: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.12312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.12324: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.12337: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.12350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.12363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.12386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.12398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.12410: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.12424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.12505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.12528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.12544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.12675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.14472: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882845.14565: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882845.14662: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmptnvkbo5p /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242/AnsiballZ_stat.py <<< 30564 1726882845.14755: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882845.16279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.16346: stderr chunk (state=3): >>><<< 30564 1726882845.16349: stdout chunk (state=3): >>><<< 30564 1726882845.16380: done transferring module to remote 30564 1726882845.16391: _low_level_execute_command(): starting 30564 1726882845.16394: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242/ /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242/AnsiballZ_stat.py && sleep 0' 30564 1726882845.17055: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.17063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.17078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.17092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.17138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.17144: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.17155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.17172: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.17176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.17183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.17191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.17200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.17211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.17225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.17233: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.17248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.17318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.17334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.17341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.17483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.19348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.19352: stdout chunk (state=3): >>><<< 30564 1726882845.19354: stderr chunk (state=3): >>><<< 30564 1726882845.19429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.19432: _low_level_execute_command(): starting 30564 1726882845.19435: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242/AnsiballZ_stat.py && sleep 0' 30564 1726882845.20961: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.20983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.21104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.21121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.21160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.21178: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.21192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.21213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.21223: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.21232: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.21243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.21255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.21274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.21286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.21296: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.21312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.21392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.21543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.21558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.21708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.34851: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882845.35885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882845.36056: stderr chunk (state=3): >>><<< 30564 1726882845.36059: stdout chunk (state=3): >>><<< 30564 1726882845.36186: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
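The "Stat profile file" task at `get_profile_stat.yml:9` drives the remote execution sequence above (home-directory probe, temp-dir creation, `AnsiballZ_stat.py` transfer and run, cleanup). A sketch of what that task likely looks like, reconstructed from the `invocation.module_args` echoed in the module result: the argument values match the log, but the `register` name and the use of a `{{ interface }}` variable (visible in the log as a play var resolving to `statebr`) are assumptions.

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/get_profile_stat.yml:9. Module arguments
# mirror the "invocation" in the result above; the register name and the
# interface variable interpolation are assumed, not confirmed by this log.
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat
```

The result (`"stat": {"exists": false}`) indicates no initscripts-style profile file is present for the interface, which is consistent with the initialized `lsr_net_profile_exists: false` flag staying false.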
30564 1726882845.36190: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882845.36194: _low_level_execute_command(): starting 30564 1726882845.36196: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882845.084727-32529-175556671418242/ > /dev/null 2>&1 && sleep 0' 30564 1726882845.37875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.37884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.37897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.37916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.37960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.37973: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.37980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.37993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 
1726882845.38000: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.38008: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.38019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.38028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.38040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.38054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.38061: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.38073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.38144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.38174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.38183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.38320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.40215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.40219: stdout chunk (state=3): >>><<< 30564 1726882845.40225: stderr chunk (state=3): >>><<< 30564 1726882845.40241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.40249: handler run complete 30564 1726882845.40284: attempt loop complete, returning result 30564 1726882845.40287: _execute() done 30564 1726882845.40289: dumping result to json 30564 1726882845.40292: done dumping result, returning 30564 1726882845.40301: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4216-acec-000000000f14] 30564 1726882845.40307: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f14 30564 1726882845.40418: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f14 30564 1726882845.40421: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882845.40528: no more pending results, returning what we have 30564 1726882845.40532: results queue empty 30564 1726882845.40534: checking for any_errors_fatal 30564 1726882845.40542: done checking for any_errors_fatal 30564 1726882845.40543: checking for max_fail_percentage 30564 1726882845.40547: done checking for max_fail_percentage 30564 1726882845.40548: checking to see if all hosts have failed and the running result is not ok 30564 1726882845.40549: done checking to see if all hosts have failed 30564 1726882845.40550: getting the remaining hosts for this loop 30564 
1726882845.40552: done getting the remaining hosts for this loop 30564 1726882845.40557: getting the next task for host managed_node2 30564 1726882845.40569: done getting next task for host managed_node2 30564 1726882845.40572: ^ task is: TASK: Set NM profile exist flag based on the profile files 30564 1726882845.40578: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882845.40583: getting variables 30564 1726882845.40584: in VariableManager get_vars() 30564 1726882845.40624: Calling all_inventory to load vars for managed_node2 30564 1726882845.40630: Calling groups_inventory to load vars for managed_node2 30564 1726882845.40635: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882845.40647: Calling all_plugins_play to load vars for managed_node2 30564 1726882845.40652: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882845.40656: Calling groups_plugins_play to load vars for managed_node2 30564 1726882845.43554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882845.47430: done with get_vars() 30564 1726882845.47458: done getting variables 30564 1726882845.47642: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:40:45 -0400 (0:00:00.432) 0:00:44.058 ****** 30564 1726882845.47676: entering _queue_task() for managed_node2/set_fact 30564 1726882845.48520: worker is 1 (out of 1 available) 30564 1726882845.48532: exiting _queue_task() for managed_node2/set_fact 30564 1726882845.48545: done queuing things up, now waiting for results queue to drain 30564 1726882845.48546: waiting for pending results... 
30564 1726882845.49079: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30564 1726882845.49226: in run() - task 0e448fcc-3ce9-4216-acec-000000000f15 30564 1726882845.49245: variable 'ansible_search_path' from source: unknown 30564 1726882845.49255: variable 'ansible_search_path' from source: unknown 30564 1726882845.49305: calling self._execute() 30564 1726882845.49408: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.49422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.49443: variable 'omit' from source: magic vars 30564 1726882845.49837: variable 'ansible_distribution_major_version' from source: facts 30564 1726882845.49954: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882845.50131: variable 'profile_stat' from source: set_fact 30564 1726882845.50141: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882845.50144: when evaluation is False, skipping this task 30564 1726882845.50147: _execute() done 30564 1726882845.50149: dumping result to json 30564 1726882845.50152: done dumping result, returning 30564 1726882845.50158: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4216-acec-000000000f15] 30564 1726882845.50165: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f15 30564 1726882845.50272: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f15 30564 1726882845.50275: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882845.50346: no more pending results, returning what we have 30564 1726882845.50351: results queue empty 30564 1726882845.50353: checking for any_errors_fatal 30564 1726882845.50365: done checking for any_errors_fatal 30564 1726882845.50366: 
checking for max_fail_percentage 30564 1726882845.50369: done checking for max_fail_percentage 30564 1726882845.50370: checking to see if all hosts have failed and the running result is not ok 30564 1726882845.50371: done checking to see if all hosts have failed 30564 1726882845.50372: getting the remaining hosts for this loop 30564 1726882845.50374: done getting the remaining hosts for this loop 30564 1726882845.50379: getting the next task for host managed_node2 30564 1726882845.50389: done getting next task for host managed_node2 30564 1726882845.50391: ^ task is: TASK: Get NM profile info 30564 1726882845.50398: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882845.50403: getting variables 30564 1726882845.50405: in VariableManager get_vars() 30564 1726882845.50444: Calling all_inventory to load vars for managed_node2 30564 1726882845.50447: Calling groups_inventory to load vars for managed_node2 30564 1726882845.50451: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882845.50467: Calling all_plugins_play to load vars for managed_node2 30564 1726882845.50471: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882845.50475: Calling groups_plugins_play to load vars for managed_node2 30564 1726882845.53027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882845.54431: done with get_vars() 30564 1726882845.54447: done getting variables 30564 1726882845.54493: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:40:45 -0400 (0:00:00.068) 0:00:44.126 ****** 30564 1726882845.54519: entering _queue_task() for managed_node2/shell 30564 1726882845.54748: worker is 1 (out of 1 available) 30564 1726882845.54765: exiting _queue_task() for managed_node2/shell 30564 1726882845.54780: done queuing things up, now waiting for results queue to drain 30564 1726882845.54782: waiting for pending results... 
30564 1726882845.55113: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30564 1726882845.55203: in run() - task 0e448fcc-3ce9-4216-acec-000000000f16 30564 1726882845.55216: variable 'ansible_search_path' from source: unknown 30564 1726882845.55220: variable 'ansible_search_path' from source: unknown 30564 1726882845.55253: calling self._execute() 30564 1726882845.55356: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.55360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.55378: variable 'omit' from source: magic vars 30564 1726882845.55798: variable 'ansible_distribution_major_version' from source: facts 30564 1726882845.55820: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882845.55824: variable 'omit' from source: magic vars 30564 1726882845.55883: variable 'omit' from source: magic vars 30564 1726882845.55991: variable 'profile' from source: play vars 30564 1726882845.55995: variable 'interface' from source: play vars 30564 1726882845.56070: variable 'interface' from source: play vars 30564 1726882845.56085: variable 'omit' from source: magic vars 30564 1726882845.56141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882845.56184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882845.56203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882845.56216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882845.56226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882845.56281: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 
1726882845.56284: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.56287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.56353: Set connection var ansible_timeout to 10 30564 1726882845.56356: Set connection var ansible_pipelining to False 30564 1726882845.56359: Set connection var ansible_shell_type to sh 30564 1726882845.56361: Set connection var ansible_shell_executable to /bin/sh 30564 1726882845.56372: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882845.56377: Set connection var ansible_connection to ssh 30564 1726882845.56395: variable 'ansible_shell_executable' from source: unknown 30564 1726882845.56399: variable 'ansible_connection' from source: unknown 30564 1726882845.56401: variable 'ansible_module_compression' from source: unknown 30564 1726882845.56405: variable 'ansible_shell_type' from source: unknown 30564 1726882845.56407: variable 'ansible_shell_executable' from source: unknown 30564 1726882845.56409: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882845.56412: variable 'ansible_pipelining' from source: unknown 30564 1726882845.56416: variable 'ansible_timeout' from source: unknown 30564 1726882845.56419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882845.56530: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882845.56546: variable 'omit' from source: magic vars 30564 1726882845.56549: starting attempt loop 30564 1726882845.56552: running the handler 30564 1726882845.56562: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882845.56587: _low_level_execute_command(): starting 30564 1726882845.56593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882845.57377: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.57390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.57523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.59187: stdout chunk (state=3): >>>/root <<< 30564 1726882845.59345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.59350: stdout chunk (state=3): >>><<< 30564 1726882845.59358: stderr chunk (state=3): >>><<< 30564 1726882845.59382: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.59397: _low_level_execute_command(): starting 30564 1726882845.59403: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643 `" && echo ansible-tmp-1726882845.5938253-32556-257239567765643="` echo /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643 `" ) && sleep 0' 30564 1726882845.60988: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.61024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.61034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.61048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 
1726882845.61107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.61110: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.61116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.61132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.61141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.61147: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.61155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.61164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.61181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.61188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.61195: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.61205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.61281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.61314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.61317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.61618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.63590: stdout chunk (state=3): >>>ansible-tmp-1726882845.5938253-32556-257239567765643=/root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643 <<< 30564 1726882845.63731: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 30564 1726882845.63822: stderr chunk (state=3): >>><<< 30564 1726882845.64587: stdout chunk (state=3): >>><<< 30564 1726882845.64912: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882845.5938253-32556-257239567765643=/root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.64916: variable 'ansible_module_compression' from source: unknown 30564 1726882845.64918: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882845.64920: variable 'ansible_facts' from source: unknown 30564 1726882845.64983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643/AnsiballZ_command.py 30564 1726882845.65180: Sending initial data 30564 
1726882845.65183: Sent initial data (156 bytes) 30564 1726882845.67007: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.67021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.67034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.67053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.67105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.67118: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.67131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.67146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.67156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.67171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.67193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.67207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.67223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.67236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.67248: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.67262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.67350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.67368: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 30564 1726882845.67387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.67530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.69365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882845.69456: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882845.69559: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpp9jqcukg /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643/AnsiballZ_command.py <<< 30564 1726882845.69655: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882845.71242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.71367: stderr chunk (state=3): >>><<< 30564 1726882845.71370: stdout chunk (state=3): >>><<< 30564 1726882845.71372: done transferring module to remote 30564 1726882845.71374: _low_level_execute_command(): starting 30564 1726882845.71376: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643/ /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643/AnsiballZ_command.py && sleep 0' 30564 1726882845.72740: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.72754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.72786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.72805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.72932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.72945: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.72959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.72978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.72993: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.73012: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.73025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.73038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.73055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.73070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.73083: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.73099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.73234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.73254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.73342: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.73470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.75371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.75374: stdout chunk (state=3): >>><<< 30564 1726882845.75377: stderr chunk (state=3): >>><<< 30564 1726882845.75470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.75474: _low_level_execute_command(): starting 30564 1726882845.75479: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643/AnsiballZ_command.py && sleep 0' 30564 1726882845.76939: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.77022: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.77038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.77057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.77104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.77123: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.77237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.77259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.77274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.77286: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.77297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.77311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.77329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.77344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.77361: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.77382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.77464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.77575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882845.77592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.77734: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.92837: stdout chunk (state=3): >>> {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:40:45.906541", "end": "2024-09-20 21:40:45.926115", "delta": "0:00:00.019574", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882845.94141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882845.94145: stdout chunk (state=3): >>><<< 30564 1726882845.94147: stderr chunk (state=3): >>><<< 30564 1726882845.94290: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "statebr /etc/NetworkManager/system-connections/statebr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:40:45.906541", "end": "2024-09-20 21:40:45.926115", "delta": "0:00:00.019574", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882845.94308: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882845.94320: _low_level_execute_command(): starting 30564 1726882845.94323: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882845.5938253-32556-257239567765643/ > /dev/null 2>&1 && 
sleep 0' 30564 1726882845.94897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882845.94912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.94931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.94950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.94994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.95006: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882845.95021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.95040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882845.95052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882845.95068: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882845.95082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882845.95097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882845.95113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882845.95126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882845.95137: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882845.95152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882845.95228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882845.95277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 
1726882845.95292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882845.95581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882845.97357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882845.97360: stdout chunk (state=3): >>><<< 30564 1726882845.97363: stderr chunk (state=3): >>><<< 30564 1726882845.97675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882845.97679: handler run complete 30564 1726882845.97681: Evaluated conditional (False): False 30564 1726882845.97683: attempt loop complete, returning result 30564 1726882845.97684: _execute() done 30564 1726882845.97686: dumping result to json 30564 1726882845.97687: done dumping result, returning 30564 1726882845.97689: done running 
TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4216-acec-000000000f16] 30564 1726882845.97691: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f16 30564 1726882845.97765: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f16 30564 1726882845.97768: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.019574", "end": "2024-09-20 21:40:45.926115", "rc": 0, "start": "2024-09-20 21:40:45.906541" } STDOUT: statebr /etc/NetworkManager/system-connections/statebr.nmconnection 30564 1726882845.97850: no more pending results, returning what we have 30564 1726882845.97853: results queue empty 30564 1726882845.97855: checking for any_errors_fatal 30564 1726882845.97860: done checking for any_errors_fatal 30564 1726882845.97861: checking for max_fail_percentage 30564 1726882845.97863: done checking for max_fail_percentage 30564 1726882845.97865: checking to see if all hosts have failed and the running result is not ok 30564 1726882845.97866: done checking to see if all hosts have failed 30564 1726882845.97867: getting the remaining hosts for this loop 30564 1726882845.97869: done getting the remaining hosts for this loop 30564 1726882845.97873: getting the next task for host managed_node2 30564 1726882845.97882: done getting next task for host managed_node2 30564 1726882845.97887: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882845.97892: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882845.97896: getting variables 30564 1726882845.97898: in VariableManager get_vars() 30564 1726882845.97932: Calling all_inventory to load vars for managed_node2 30564 1726882845.97935: Calling groups_inventory to load vars for managed_node2 30564 1726882845.97939: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882845.97955: Calling all_plugins_play to load vars for managed_node2 30564 1726882845.97958: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882845.97961: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.00481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.03076: done with get_vars() 30564 1726882846.03100: done getting variables 30564 1726882846.03161: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli 
output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:40:46 -0400 (0:00:00.486) 0:00:44.613 ****** 30564 1726882846.03198: entering _queue_task() for managed_node2/set_fact 30564 1726882846.03496: worker is 1 (out of 1 available) 30564 1726882846.03509: exiting _queue_task() for managed_node2/set_fact 30564 1726882846.03521: done queuing things up, now waiting for results queue to drain 30564 1726882846.03523: waiting for pending results... 30564 1726882846.03810: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882846.03935: in run() - task 0e448fcc-3ce9-4216-acec-000000000f17 30564 1726882846.03957: variable 'ansible_search_path' from source: unknown 30564 1726882846.03968: variable 'ansible_search_path' from source: unknown 30564 1726882846.04005: calling self._execute() 30564 1726882846.04114: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.04123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.04138: variable 'omit' from source: magic vars 30564 1726882846.04517: variable 'ansible_distribution_major_version' from source: facts 30564 1726882846.04536: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882846.04673: variable 'nm_profile_exists' from source: set_fact 30564 1726882846.04691: Evaluated conditional (nm_profile_exists.rc == 0): True 30564 1726882846.04703: variable 'omit' from source: magic vars 30564 1726882846.04765: variable 'omit' from source: magic vars 30564 1726882846.04804: variable 'omit' from source: magic vars 30564 1726882846.04855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882846.04897: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882846.04920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882846.04947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882846.04966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882846.05003: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882846.05012: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.05019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.05129: Set connection var ansible_timeout to 10 30564 1726882846.05140: Set connection var ansible_pipelining to False 30564 1726882846.05147: Set connection var ansible_shell_type to sh 30564 1726882846.05162: Set connection var ansible_shell_executable to /bin/sh 30564 1726882846.05177: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882846.05184: Set connection var ansible_connection to ssh 30564 1726882846.05213: variable 'ansible_shell_executable' from source: unknown 30564 1726882846.05221: variable 'ansible_connection' from source: unknown 30564 1726882846.05228: variable 'ansible_module_compression' from source: unknown 30564 1726882846.05234: variable 'ansible_shell_type' from source: unknown 30564 1726882846.05240: variable 'ansible_shell_executable' from source: unknown 30564 1726882846.05247: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.05254: variable 'ansible_pipelining' from source: unknown 30564 1726882846.05263: variable 'ansible_timeout' from source: unknown 30564 1726882846.05276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882846.05424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882846.05441: variable 'omit' from source: magic vars 30564 1726882846.05451: starting attempt loop 30564 1726882846.05458: running the handler 30564 1726882846.05481: handler run complete 30564 1726882846.05500: attempt loop complete, returning result 30564 1726882846.05507: _execute() done 30564 1726882846.05513: dumping result to json 30564 1726882846.05520: done dumping result, returning 30564 1726882846.05531: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4216-acec-000000000f17] 30564 1726882846.05541: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f17 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 30564 1726882846.05693: no more pending results, returning what we have 30564 1726882846.05697: results queue empty 30564 1726882846.05698: checking for any_errors_fatal 30564 1726882846.05707: done checking for any_errors_fatal 30564 1726882846.05708: checking for max_fail_percentage 30564 1726882846.05710: done checking for max_fail_percentage 30564 1726882846.05711: checking to see if all hosts have failed and the running result is not ok 30564 1726882846.05712: done checking to see if all hosts have failed 30564 1726882846.05713: getting the remaining hosts for this loop 30564 1726882846.05715: done getting the remaining hosts for this loop 30564 1726882846.05718: getting the next task for host managed_node2 30564 1726882846.05731: done getting next task for host managed_node2 30564 
1726882846.05733: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882846.05739: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882846.05742: getting variables 30564 1726882846.05744: in VariableManager get_vars() 30564 1726882846.05779: Calling all_inventory to load vars for managed_node2 30564 1726882846.05782: Calling groups_inventory to load vars for managed_node2 30564 1726882846.05785: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882846.05797: Calling all_plugins_play to load vars for managed_node2 30564 1726882846.05801: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882846.05804: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.06781: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f17 30564 1726882846.06784: WORKER PROCESS EXITING 30564 1726882846.07516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.09287: done with get_vars() 30564 1726882846.09311: done getting variables 30564 1726882846.09354: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882846.09447: variable 'profile' from source: play vars 30564 1726882846.09450: variable 'interface' from source: play vars 30564 1726882846.09494: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:40:46 -0400 (0:00:00.063) 0:00:44.676 ****** 30564 1726882846.09517: entering _queue_task() for managed_node2/command 30564 1726882846.09734: worker is 1 (out of 1 available) 30564 1726882846.09747: exiting _queue_task() for managed_node2/command 30564 
1726882846.09761: done queuing things up, now waiting for results queue to drain 30564 1726882846.09762: waiting for pending results... 30564 1726882846.09945: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30564 1726882846.10160: in run() - task 0e448fcc-3ce9-4216-acec-000000000f19 30564 1726882846.10166: variable 'ansible_search_path' from source: unknown 30564 1726882846.10171: variable 'ansible_search_path' from source: unknown 30564 1726882846.10175: calling self._execute() 30564 1726882846.10305: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.10308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.10311: variable 'omit' from source: magic vars 30564 1726882846.10574: variable 'ansible_distribution_major_version' from source: facts 30564 1726882846.10585: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882846.11000: variable 'profile_stat' from source: set_fact 30564 1726882846.11011: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882846.11014: when evaluation is False, skipping this task 30564 1726882846.11017: _execute() done 30564 1726882846.11019: dumping result to json 30564 1726882846.11022: done dumping result, returning 30564 1726882846.11029: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000f19] 30564 1726882846.11036: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f19 30564 1726882846.11124: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f19 30564 1726882846.11126: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882846.11180: no more pending results, returning what we have 30564 1726882846.11184: results queue empty 30564 
1726882846.11185: checking for any_errors_fatal 30564 1726882846.11190: done checking for any_errors_fatal 30564 1726882846.11190: checking for max_fail_percentage 30564 1726882846.11192: done checking for max_fail_percentage 30564 1726882846.11193: checking to see if all hosts have failed and the running result is not ok 30564 1726882846.11194: done checking to see if all hosts have failed 30564 1726882846.11194: getting the remaining hosts for this loop 30564 1726882846.11196: done getting the remaining hosts for this loop 30564 1726882846.11199: getting the next task for host managed_node2 30564 1726882846.11205: done getting next task for host managed_node2 30564 1726882846.11208: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882846.11212: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882846.11215: getting variables 30564 1726882846.11216: in VariableManager get_vars() 30564 1726882846.11242: Calling all_inventory to load vars for managed_node2 30564 1726882846.11244: Calling groups_inventory to load vars for managed_node2 30564 1726882846.11246: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882846.11256: Calling all_plugins_play to load vars for managed_node2 30564 1726882846.11259: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882846.11262: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.16359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.17316: done with get_vars() 30564 1726882846.17339: done getting variables 30564 1726882846.17378: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882846.17456: variable 'profile' from source: play vars 30564 1726882846.17459: variable 'interface' from source: play vars 30564 1726882846.17516: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:40:46 -0400 (0:00:00.080) 0:00:44.756 ****** 30564 1726882846.17538: entering _queue_task() for managed_node2/set_fact 30564 1726882846.17813: worker is 1 (out of 1 available) 30564 1726882846.17827: exiting _queue_task() for managed_node2/set_fact 30564 1726882846.17839: done queuing things up, now waiting for results queue to drain 30564 1726882846.17841: waiting for pending results... 
30564 1726882846.18273: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr
30564 1726882846.18285: in run() - task 0e448fcc-3ce9-4216-acec-000000000f1a
30564 1726882846.18290: variable 'ansible_search_path' from source: unknown
30564 1726882846.18296: variable 'ansible_search_path' from source: unknown
30564 1726882846.18841: calling self._execute()
30564 1726882846.18845: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.18849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.18854: variable 'omit' from source: magic vars
30564 1726882846.18858: variable 'ansible_distribution_major_version' from source: facts
30564 1726882846.18862: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882846.18956: variable 'profile_stat' from source: set_fact
30564 1726882846.18960: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882846.18965: when evaluation is False, skipping this task
30564 1726882846.18971: _execute() done
30564 1726882846.18974: dumping result to json
30564 1726882846.18979: done dumping result, returning
30564 1726882846.18990: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000f1a]
30564 1726882846.18999: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f1a
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882846.19154: no more pending results, returning what we have
30564 1726882846.19158: results queue empty
30564 1726882846.19159: checking for any_errors_fatal
30564 1726882846.19169: done checking for any_errors_fatal
30564 1726882846.19170: checking for max_fail_percentage
30564 1726882846.19172: done checking for max_fail_percentage
30564 1726882846.19173: checking to see if all hosts have failed and the running result is not ok
30564 1726882846.19173: done checking to see if all hosts have failed
30564 1726882846.19174: getting the remaining hosts for this loop
30564 1726882846.19176: done getting the remaining hosts for this loop
30564 1726882846.19179: getting the next task for host managed_node2
30564 1726882846.19187: done getting next task for host managed_node2
30564 1726882846.19189: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
30564 1726882846.19195: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882846.19200: getting variables
30564 1726882846.19202: in VariableManager get_vars()
30564 1726882846.19254: Calling all_inventory to load vars for managed_node2
30564 1726882846.19258: Calling groups_inventory to load vars for managed_node2
30564 1726882846.19261: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882846.19272: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f1a
30564 1726882846.19275: WORKER PROCESS EXITING
30564 1726882846.19288: Calling all_plugins_play to load vars for managed_node2
30564 1726882846.19293: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882846.19296: Calling groups_plugins_play to load vars for managed_node2
30564 1726882846.21313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882846.22514: done with get_vars()
30564 1726882846.22530: done getting variables
30564 1726882846.22593: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882846.22696: variable 'profile' from source: play vars
30564 1726882846.22700: variable 'interface' from source: play vars
30564 1726882846.22753: variable 'interface' from source: play vars

TASK [Get the fingerprint comment in ifcfg-statebr] ****************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 21:40:46 -0400 (0:00:00.052) 0:00:44.809 ******
30564 1726882846.22787: entering _queue_task() for managed_node2/command
30564 1726882846.23068: worker is 1 (out of 1 available)
30564 1726882846.23082: exiting _queue_task() for managed_node2/command
30564 1726882846.23095: done queuing things up, now waiting for results queue to drain
30564 1726882846.23096: waiting for pending results...
30564 1726882846.23380: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr
30564 1726882846.23502: in run() - task 0e448fcc-3ce9-4216-acec-000000000f1b
30564 1726882846.23513: variable 'ansible_search_path' from source: unknown
30564 1726882846.23517: variable 'ansible_search_path' from source: unknown
30564 1726882846.23556: calling self._execute()
30564 1726882846.23638: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.23649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.23673: variable 'omit' from source: magic vars
30564 1726882846.23994: variable 'ansible_distribution_major_version' from source: facts
30564 1726882846.24005: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882846.24087: variable 'profile_stat' from source: set_fact
30564 1726882846.24099: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882846.24102: when evaluation is False, skipping this task
30564 1726882846.24106: _execute() done
30564 1726882846.24109: dumping result to json
30564 1726882846.24112: done dumping result, returning
30564 1726882846.24114: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000f1b]
30564 1726882846.24119: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f1b
30564 1726882846.24219: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f1b
30564 1726882846.24223: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882846.24277: no more pending results, returning what we have
30564 1726882846.24281: results queue empty
30564 1726882846.24282: checking for any_errors_fatal
30564 1726882846.24287: done checking for any_errors_fatal
30564 1726882846.24288: checking for max_fail_percentage
30564 1726882846.24289: done checking for max_fail_percentage
30564 1726882846.24290: checking to see if all hosts have failed and the running result is not ok
30564 1726882846.24291: done checking to see if all hosts have failed
30564 1726882846.24292: getting the remaining hosts for this loop
30564 1726882846.24293: done getting the remaining hosts for this loop
30564 1726882846.24296: getting the next task for host managed_node2
30564 1726882846.24302: done getting next task for host managed_node2
30564 1726882846.24304: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
30564 1726882846.24309: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882846.24312: getting variables
30564 1726882846.24313: in VariableManager get_vars()
30564 1726882846.24340: Calling all_inventory to load vars for managed_node2
30564 1726882846.24342: Calling groups_inventory to load vars for managed_node2
30564 1726882846.24345: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882846.24353: Calling all_plugins_play to load vars for managed_node2
30564 1726882846.24356: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882846.24358: Calling groups_plugins_play to load vars for managed_node2
30564 1726882846.25137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882846.26105: done with get_vars()
30564 1726882846.26121: done getting variables
30564 1726882846.26159: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882846.26234: variable 'profile' from source: play vars
30564 1726882846.26237: variable 'interface' from source: play vars
30564 1726882846.26278: variable 'interface' from source: play vars

TASK [Verify the fingerprint comment in ifcfg-statebr] *************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:40:46 -0400 (0:00:00.035) 0:00:44.844 ******
30564 1726882846.26299: entering _queue_task() for managed_node2/set_fact
30564 1726882846.26483: worker is 1 (out of 1 available)
30564 1726882846.26496: exiting _queue_task() for managed_node2/set_fact
30564 1726882846.26508: done queuing things up, now waiting for results queue to drain
30564 1726882846.26509: waiting for pending results...
30564 1726882846.26673: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr
30564 1726882846.26755: in run() - task 0e448fcc-3ce9-4216-acec-000000000f1c
30564 1726882846.26775: variable 'ansible_search_path' from source: unknown
30564 1726882846.26778: variable 'ansible_search_path' from source: unknown
30564 1726882846.26802: calling self._execute()
30564 1726882846.26874: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.26878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.26886: variable 'omit' from source: magic vars
30564 1726882846.27139: variable 'ansible_distribution_major_version' from source: facts
30564 1726882846.27148: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882846.27232: variable 'profile_stat' from source: set_fact
30564 1726882846.27240: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882846.27243: when evaluation is False, skipping this task
30564 1726882846.27246: _execute() done
30564 1726882846.27248: dumping result to json
30564 1726882846.27250: done dumping result, returning
30564 1726882846.27256: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000000f1c]
30564 1726882846.27262: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f1c
30564 1726882846.27349: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f1c
30564 1726882846.27352: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882846.27420: no more pending results, returning what we have
30564 1726882846.27423: results queue empty
30564 1726882846.27424: checking for any_errors_fatal
30564 1726882846.27430: done checking for any_errors_fatal
30564 1726882846.27430: checking for max_fail_percentage
30564 1726882846.27432: done checking for max_fail_percentage
30564 1726882846.27433: checking to see if all hosts have failed and the running result is not ok
30564 1726882846.27433: done checking to see if all hosts have failed
30564 1726882846.27434: getting the remaining hosts for this loop
30564 1726882846.27435: done getting the remaining hosts for this loop
30564 1726882846.27438: getting the next task for host managed_node2
30564 1726882846.27446: done getting next task for host managed_node2
30564 1726882846.27449: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
30564 1726882846.27452: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882846.27456: getting variables
30564 1726882846.27457: in VariableManager get_vars()
30564 1726882846.27488: Calling all_inventory to load vars for managed_node2
30564 1726882846.27490: Calling groups_inventory to load vars for managed_node2
30564 1726882846.27492: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882846.27499: Calling all_plugins_play to load vars for managed_node2
30564 1726882846.27500: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882846.27502: Calling groups_plugins_play to load vars for managed_node2
30564 1726882846.28401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882846.29333: done with get_vars()
30564 1726882846.29347: done getting variables
30564 1726882846.29390: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882846.29467: variable 'profile' from source: play vars
30564 1726882846.29472: variable 'interface' from source: play vars
30564 1726882846.29510: variable 'interface' from source: play vars

TASK [Assert that the profile is present - 'statebr'] **************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 21:40:46 -0400 (0:00:00.032) 0:00:44.876 ******
30564 1726882846.29533: entering _queue_task() for managed_node2/assert
30564 1726882846.29725: worker is 1 (out of 1 available)
30564 1726882846.29738: exiting _queue_task() for managed_node2/assert
30564 1726882846.29750: done queuing things up, now waiting for results queue to drain
30564 1726882846.29751: waiting for pending results...
30564 1726882846.29923: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr'
30564 1726882846.29996: in run() - task 0e448fcc-3ce9-4216-acec-000000000e8c
30564 1726882846.30007: variable 'ansible_search_path' from source: unknown
30564 1726882846.30010: variable 'ansible_search_path' from source: unknown
30564 1726882846.30036: calling self._execute()
30564 1726882846.30106: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.30109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.30118: variable 'omit' from source: magic vars
30564 1726882846.30376: variable 'ansible_distribution_major_version' from source: facts
30564 1726882846.30386: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882846.30392: variable 'omit' from source: magic vars
30564 1726882846.30426: variable 'omit' from source: magic vars
30564 1726882846.30492: variable 'profile' from source: play vars
30564 1726882846.30496: variable 'interface' from source: play vars
30564 1726882846.30542: variable 'interface' from source: play vars
30564 1726882846.30555: variable 'omit' from source: magic vars
30564 1726882846.30588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882846.30617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882846.30630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882846.30644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882846.30654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882846.30682: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882846.30685: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.30688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.30755: Set connection var ansible_timeout to 10
30564 1726882846.30759: Set connection var ansible_pipelining to False
30564 1726882846.30761: Set connection var ansible_shell_type to sh
30564 1726882846.30771: Set connection var ansible_shell_executable to /bin/sh
30564 1726882846.30776: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882846.30779: Set connection var ansible_connection to ssh
30564 1726882846.30796: variable 'ansible_shell_executable' from source: unknown
30564 1726882846.30798: variable 'ansible_connection' from source: unknown
30564 1726882846.30801: variable 'ansible_module_compression' from source: unknown
30564 1726882846.30804: variable 'ansible_shell_type' from source: unknown
30564 1726882846.30806: variable 'ansible_shell_executable' from source: unknown
30564 1726882846.30808: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.30812: variable 'ansible_pipelining' from source: unknown
30564 1726882846.30815: variable 'ansible_timeout' from source: unknown
30564 1726882846.30818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.30915: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882846.30923: variable 'omit' from source: magic vars
30564 1726882846.30928: starting attempt loop
30564 1726882846.30931: running the handler
30564 1726882846.31007: variable 'lsr_net_profile_exists' from source: set_fact
30564 1726882846.31010: Evaluated conditional (lsr_net_profile_exists): True
30564 1726882846.31017: handler run complete
30564 1726882846.31029: attempt loop complete, returning result
30564 1726882846.31031: _execute() done
30564 1726882846.31034: dumping result to json
30564 1726882846.31037: done dumping result, returning
30564 1726882846.31042: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'statebr' [0e448fcc-3ce9-4216-acec-000000000e8c]
30564 1726882846.31049: sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8c
30564 1726882846.31141: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8c
30564 1726882846.31144: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
30564 1726882846.31210: no more pending results, returning what we have
30564 1726882846.31213: results queue empty
30564 1726882846.31213: checking for any_errors_fatal
30564 1726882846.31218: done checking for any_errors_fatal
30564 1726882846.31219: checking for max_fail_percentage
30564 1726882846.31220: done checking for max_fail_percentage
30564 1726882846.31221: checking to see if all hosts have failed and the running result is not ok
30564 1726882846.31222: done checking to see if all hosts have failed
30564 1726882846.31223: getting the remaining hosts for this loop
30564 1726882846.31224: done getting the remaining hosts for this loop
30564 1726882846.31227: getting the next task for host managed_node2
30564 1726882846.31232: done getting next task for host managed_node2
30564 1726882846.31235: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
30564 1726882846.31238: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882846.31241: getting variables
30564 1726882846.31243: in VariableManager get_vars()
30564 1726882846.31279: Calling all_inventory to load vars for managed_node2
30564 1726882846.31281: Calling groups_inventory to load vars for managed_node2
30564 1726882846.31284: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882846.31290: Calling all_plugins_play to load vars for managed_node2
30564 1726882846.31292: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882846.31294: Calling groups_plugins_play to load vars for managed_node2
30564 1726882846.32103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882846.33144: done with get_vars()
30564 1726882846.33158: done getting variables
30564 1726882846.33199: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882846.33275: variable 'profile' from source: play vars
30564 1726882846.33278: variable 'interface' from source: play vars
30564 1726882846.33313: variable 'interface' from source: play vars

TASK [Assert that the ansible managed comment is present in 'statebr'] *********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Friday 20 September 2024 21:40:46 -0400 (0:00:00.038) 0:00:44.914 ******
30564 1726882846.33339: entering _queue_task() for managed_node2/assert
30564 1726882846.33520: worker is 1 (out of 1 available)
30564 1726882846.33532: exiting _queue_task() for managed_node2/assert
30564 1726882846.33543: done queuing things up, now waiting for results queue to drain
30564 1726882846.33544: waiting for pending results...
30564 1726882846.33715: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr'
30564 1726882846.33808: in run() - task 0e448fcc-3ce9-4216-acec-000000000e8d
30564 1726882846.33819: variable 'ansible_search_path' from source: unknown
30564 1726882846.33823: variable 'ansible_search_path' from source: unknown
30564 1726882846.33849: calling self._execute()
30564 1726882846.33921: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.33925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.33934: variable 'omit' from source: magic vars
30564 1726882846.34200: variable 'ansible_distribution_major_version' from source: facts
30564 1726882846.34212: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882846.34218: variable 'omit' from source: magic vars
30564 1726882846.34248: variable 'omit' from source: magic vars
30564 1726882846.34318: variable 'profile' from source: play vars
30564 1726882846.34322: variable 'interface' from source: play vars
30564 1726882846.34366: variable 'interface' from source: play vars
30564 1726882846.34382: variable 'omit' from source: magic vars
30564 1726882846.34416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882846.34440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882846.34455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882846.34469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882846.34482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882846.34505: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882846.34508: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.34511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.34583: Set connection var ansible_timeout to 10
30564 1726882846.34586: Set connection var ansible_pipelining to False
30564 1726882846.34589: Set connection var ansible_shell_type to sh
30564 1726882846.34595: Set connection var ansible_shell_executable to /bin/sh
30564 1726882846.34601: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882846.34603: Set connection var ansible_connection to ssh
30564 1726882846.34621: variable 'ansible_shell_executable' from source: unknown
30564 1726882846.34624: variable 'ansible_connection' from source: unknown
30564 1726882846.34628: variable 'ansible_module_compression' from source: unknown
30564 1726882846.34630: variable 'ansible_shell_type' from source: unknown
30564 1726882846.34632: variable 'ansible_shell_executable' from source: unknown
30564 1726882846.34635: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.34639: variable 'ansible_pipelining' from source: unknown
30564 1726882846.34641: variable 'ansible_timeout' from source: unknown
30564 1726882846.34644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.34740: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882846.34748: variable 'omit' from source: magic vars
30564 1726882846.34754: starting attempt loop
30564 1726882846.34757: running the handler
30564 1726882846.34829: variable 'lsr_net_profile_ansible_managed' from source: set_fact
30564 1726882846.34833: Evaluated conditional (lsr_net_profile_ansible_managed): True
30564 1726882846.34839: handler run complete
30564 1726882846.34849: attempt loop complete, returning result
30564 1726882846.34852: _execute() done
30564 1726882846.34855: dumping result to json
30564 1726882846.34857: done dumping result, returning
30564 1726882846.34870: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'statebr' [0e448fcc-3ce9-4216-acec-000000000e8d]
30564 1726882846.34873: sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8d
30564 1726882846.34950: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8d
30564 1726882846.34953: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
30564 1726882846.35026: no more pending results, returning what we have
30564 1726882846.35028: results queue empty
30564 1726882846.35029: checking for any_errors_fatal
30564 1726882846.35034: done checking for any_errors_fatal
30564 1726882846.35034: checking for max_fail_percentage
30564 1726882846.35036: done checking for max_fail_percentage
30564 1726882846.35037: checking to see if all hosts have failed and the running result is not ok
30564 1726882846.35038: done checking to see if all hosts have failed
30564 1726882846.35038: getting the remaining hosts for this loop
30564 1726882846.35040: done getting the remaining hosts for this loop
30564 1726882846.35043: getting the next task for host managed_node2
30564 1726882846.35049: done getting next task for host managed_node2
30564 1726882846.35051: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
30564 1726882846.35054: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882846.35057: getting variables
30564 1726882846.35059: in VariableManager get_vars()
30564 1726882846.35092: Calling all_inventory to load vars for managed_node2
30564 1726882846.35094: Calling groups_inventory to load vars for managed_node2
30564 1726882846.35097: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882846.35103: Calling all_plugins_play to load vars for managed_node2
30564 1726882846.35105: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882846.35107: Calling groups_plugins_play to load vars for managed_node2
30564 1726882846.35878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882846.36835: done with get_vars()
30564 1726882846.36849: done getting variables
30564 1726882846.36890: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882846.36959: variable 'profile' from source: play vars
30564 1726882846.36962: variable 'interface' from source: play vars
30564 1726882846.37006: variable 'interface' from source: play vars

TASK [Assert that the fingerprint comment is present in statebr] ***************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Friday 20 September 2024 21:40:46 -0400 (0:00:00.036) 0:00:44.951 ******
30564 1726882846.37028: entering _queue_task() for managed_node2/assert
30564 1726882846.37198: worker is 1 (out of 1 available)
30564 1726882846.37211: exiting _queue_task() for managed_node2/assert
30564 1726882846.37222: done queuing things up, now waiting for results queue to drain
30564 1726882846.37224: waiting for pending results...
30564 1726882846.37409: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr
30564 1726882846.37481: in run() - task 0e448fcc-3ce9-4216-acec-000000000e8e
30564 1726882846.37492: variable 'ansible_search_path' from source: unknown
30564 1726882846.37496: variable 'ansible_search_path' from source: unknown
30564 1726882846.37522: calling self._execute()
30564 1726882846.37591: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.37595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.37604: variable 'omit' from source: magic vars
30564 1726882846.37856: variable 'ansible_distribution_major_version' from source: facts
30564 1726882846.37867: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882846.37873: variable 'omit' from source: magic vars
30564 1726882846.37907: variable 'omit' from source: magic vars
30564 1726882846.37984: variable 'profile' from source: play vars
30564 1726882846.37988: variable 'interface' from source: play vars
30564 1726882846.38033: variable 'interface' from source: play vars
30564 1726882846.38046: variable 'omit' from source: magic vars
30564 1726882846.38078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882846.38103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882846.38124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882846.38135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882846.38145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882846.38171: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882846.38174: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.38176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.38251: Set connection var ansible_timeout to 10
30564 1726882846.38255: Set connection var ansible_pipelining to False
30564 1726882846.38257: Set connection var ansible_shell_type to sh
30564 1726882846.38263: Set connection var ansible_shell_executable to /bin/sh
30564 1726882846.38285: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882846.38605: Set connection var ansible_connection to ssh
30564 1726882846.38608: variable 'ansible_shell_executable' from source: unknown
30564 1726882846.38610: variable 'ansible_connection' from source: unknown
30564 1726882846.38617: variable 'ansible_module_compression' from source: unknown
30564 1726882846.38619: variable 'ansible_shell_type' from source: unknown
30564 1726882846.38621: variable 'ansible_shell_executable' from source: unknown
30564 1726882846.38623: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882846.38626: variable 'ansible_pipelining' from source: unknown
30564 1726882846.38628: variable 'ansible_timeout' from source: unknown
30564 1726882846.38630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882846.38633: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882846.38635: variable 'omit' from source: magic vars
30564 1726882846.38637: starting attempt loop
30564 1726882846.38639: running the handler
30564 1726882846.38641: variable 'lsr_net_profile_fingerprint' from source: set_fact
30564 1726882846.38643: Evaluated conditional (lsr_net_profile_fingerprint): True
30564 1726882846.38645: handler run complete
30564 1726882846.38647: attempt loop complete, returning result
30564 1726882846.38649: _execute() done
30564 1726882846.38652: dumping result to json
30564 1726882846.38654: done dumping result, returning
30564 1726882846.38656: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in statebr [0e448fcc-3ce9-4216-acec-000000000e8e]
30564 1726882846.38658: sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8e
30564 1726882846.38727: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000e8e
30564 1726882846.38730: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
30564 1726882846.38791: no more pending results, returning what we have
30564 1726882846.38794: results queue empty
30564 1726882846.38796: checking for any_errors_fatal
30564 1726882846.38816: done checking for any_errors_fatal
30564 1726882846.38818: checking for max_fail_percentage
30564 1726882846.38820: done checking for max_fail_percentage
30564 1726882846.38821: checking to see if all hosts have failed and the running result is not ok
30564 1726882846.38822: done checking to see if all hosts have failed
30564 1726882846.38823: getting the remaining hosts for this loop
30564 1726882846.38825: done getting the remaining hosts for this loop
30564 1726882846.38829: getting the next task for host managed_node2
30564 1726882846.38846: done getting next task for host managed_node2
30564 1726882846.38850: ^ task is: TASK: Conditional asserts
30564 1726882846.38854: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882846.38860: getting variables 30564 1726882846.38862: in VariableManager get_vars() 30564 1726882846.38899: Calling all_inventory to load vars for managed_node2 30564 1726882846.38901: Calling groups_inventory to load vars for managed_node2 30564 1726882846.38905: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882846.38916: Calling all_plugins_play to load vars for managed_node2 30564 1726882846.38919: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882846.38923: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.40477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.41958: done with get_vars() 30564 1726882846.41981: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:40:46 -0400 (0:00:00.050) 0:00:45.001 ****** 30564 1726882846.42057: entering _queue_task() for managed_node2/include_tasks 30564 1726882846.42292: worker is 1 (out of 1 available) 30564 1726882846.42306: exiting _queue_task() for managed_node2/include_tasks 30564 1726882846.42318: done queuing things up, now waiting for results queue to drain 30564 1726882846.42319: waiting for pending results... 
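(Editor's note: each task banner prints the previous task's duration in parentheses followed by the cumulative elapsed time, and the figures in this log are consistent to within a millisecond of rounding — e.g. 0:00:44.951 followed by 0:00:45.001 implies the "(0:00:00.050)" shown in the banner above. In millisecond arithmetic:)

```shell
# Cumulative elapsed times from the two banners above, in milliseconds.
prev=44951   # 0:00:44.951 (fingerprint-assert task banner)
curr=45001   # 0:00:45.001 (Conditional asserts banner)
delta=$(( curr - prev ))
# Matches the per-task duration "(0:00:00.050)" printed in the log.
printf '0:00:%02d.%03d\n' $(( delta / 1000 )) $(( delta % 1000 ))
```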
30564 1726882846.42602: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30564 1726882846.42717: in run() - task 0e448fcc-3ce9-4216-acec-000000000a4f 30564 1726882846.42735: variable 'ansible_search_path' from source: unknown 30564 1726882846.42742: variable 'ansible_search_path' from source: unknown 30564 1726882846.43026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882846.45432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882846.45508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882846.45551: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882846.45600: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882846.45632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882846.45743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882846.45782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882846.45842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882846.45898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30564 1726882846.45927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882846.46132: dumping result to json 30564 1726882846.46142: done dumping result, returning 30564 1726882846.46152: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0e448fcc-3ce9-4216-acec-000000000a4f] 30564 1726882846.46162: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4f skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 30564 1726882846.46335: no more pending results, returning what we have 30564 1726882846.46338: results queue empty 30564 1726882846.46340: checking for any_errors_fatal 30564 1726882846.46348: done checking for any_errors_fatal 30564 1726882846.46350: checking for max_fail_percentage 30564 1726882846.46352: done checking for max_fail_percentage 30564 1726882846.46353: checking to see if all hosts have failed and the running result is not ok 30564 1726882846.46353: done checking to see if all hosts have failed 30564 1726882846.46354: getting the remaining hosts for this loop 30564 1726882846.46356: done getting the remaining hosts for this loop 30564 1726882846.46361: getting the next task for host managed_node2 30564 1726882846.46374: done getting next task for host managed_node2 30564 1726882846.46378: ^ task is: TASK: Success in test '{{ lsr_description }}' 30564 1726882846.46382: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882846.46386: getting variables 30564 1726882846.46388: in VariableManager get_vars() 30564 1726882846.46424: Calling all_inventory to load vars for managed_node2 30564 1726882846.46427: Calling groups_inventory to load vars for managed_node2 30564 1726882846.46431: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882846.46443: Calling all_plugins_play to load vars for managed_node2 30564 1726882846.46447: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882846.46450: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.47647: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a4f 30564 1726882846.47651: WORKER PROCESS EXITING 30564 1726882846.48775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.50645: done with get_vars() 30564 1726882846.50673: done getting variables 30564 1726882846.50734: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882846.50857: variable 'lsr_description' from source: include params TASK [Success in test 'I can activate an existing profile'] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:40:46 -0400 (0:00:00.088) 0:00:45.090 ****** 30564 1726882846.50891: entering _queue_task() for managed_node2/debug 30564 1726882846.51196: worker is 1 (out of 1 available) 30564 1726882846.51208: exiting _queue_task() for managed_node2/debug 30564 
1726882846.51220: done queuing things up, now waiting for results queue to drain 30564 1726882846.51221: waiting for pending results... 30564 1726882846.51519: running TaskExecutor() for managed_node2/TASK: Success in test 'I can activate an existing profile' 30564 1726882846.51635: in run() - task 0e448fcc-3ce9-4216-acec-000000000a50 30564 1726882846.51652: variable 'ansible_search_path' from source: unknown 30564 1726882846.51661: variable 'ansible_search_path' from source: unknown 30564 1726882846.51707: calling self._execute() 30564 1726882846.51811: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.51821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.51836: variable 'omit' from source: magic vars 30564 1726882846.52229: variable 'ansible_distribution_major_version' from source: facts 30564 1726882846.52246: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882846.52256: variable 'omit' from source: magic vars 30564 1726882846.52305: variable 'omit' from source: magic vars 30564 1726882846.52412: variable 'lsr_description' from source: include params 30564 1726882846.52440: variable 'omit' from source: magic vars 30564 1726882846.52489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882846.52527: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882846.52555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882846.52582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882846.52599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882846.52632: variable 'inventory_hostname' from source: 
host vars for 'managed_node2' 30564 1726882846.52640: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.52653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.52749: Set connection var ansible_timeout to 10 30564 1726882846.52770: Set connection var ansible_pipelining to False 30564 1726882846.52778: Set connection var ansible_shell_type to sh 30564 1726882846.52787: Set connection var ansible_shell_executable to /bin/sh 30564 1726882846.52799: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882846.52804: Set connection var ansible_connection to ssh 30564 1726882846.52828: variable 'ansible_shell_executable' from source: unknown 30564 1726882846.52834: variable 'ansible_connection' from source: unknown 30564 1726882846.52840: variable 'ansible_module_compression' from source: unknown 30564 1726882846.52845: variable 'ansible_shell_type' from source: unknown 30564 1726882846.52851: variable 'ansible_shell_executable' from source: unknown 30564 1726882846.52856: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.52876: variable 'ansible_pipelining' from source: unknown 30564 1726882846.52882: variable 'ansible_timeout' from source: unknown 30564 1726882846.52889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.53032: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882846.53047: variable 'omit' from source: magic vars 30564 1726882846.53056: starting attempt loop 30564 1726882846.53061: running the handler 30564 1726882846.53121: handler run complete 30564 1726882846.53139: attempt loop complete, returning result 30564 
1726882846.53145: _execute() done 30564 1726882846.53151: dumping result to json 30564 1726882846.53157: done dumping result, returning 30564 1726882846.53172: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can activate an existing profile' [0e448fcc-3ce9-4216-acec-000000000a50] 30564 1726882846.53183: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a50 30564 1726882846.53296: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a50 ok: [managed_node2] => {} MSG: +++++ Success in test 'I can activate an existing profile' +++++ 30564 1726882846.53351: no more pending results, returning what we have 30564 1726882846.53355: results queue empty 30564 1726882846.53356: checking for any_errors_fatal 30564 1726882846.53365: done checking for any_errors_fatal 30564 1726882846.53365: checking for max_fail_percentage 30564 1726882846.53370: done checking for max_fail_percentage 30564 1726882846.53371: checking to see if all hosts have failed and the running result is not ok 30564 1726882846.53372: done checking to see if all hosts have failed 30564 1726882846.53372: getting the remaining hosts for this loop 30564 1726882846.53374: done getting the remaining hosts for this loop 30564 1726882846.53378: getting the next task for host managed_node2 30564 1726882846.53388: done getting next task for host managed_node2 30564 1726882846.53391: ^ task is: TASK: Cleanup 30564 1726882846.53395: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30564 1726882846.53400: getting variables 30564 1726882846.53402: in VariableManager get_vars() 30564 1726882846.53437: Calling all_inventory to load vars for managed_node2 30564 1726882846.53441: Calling groups_inventory to load vars for managed_node2 30564 1726882846.53444: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882846.53455: Calling all_plugins_play to load vars for managed_node2 30564 1726882846.53459: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882846.53462: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.54500: WORKER PROCESS EXITING 30564 1726882846.55661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.58436: done with get_vars() 30564 1726882846.58460: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:40:46 -0400 (0:00:00.076) 0:00:45.166 ****** 30564 1726882846.58557: entering _queue_task() for managed_node2/include_tasks 30564 1726882846.58859: worker is 1 (out of 1 available) 30564 1726882846.58876: exiting _queue_task() for managed_node2/include_tasks 30564 1726882846.58889: done queuing things up, now waiting for results queue to drain 30564 1726882846.58890: waiting for pending results... 
30564 1726882846.59186: running TaskExecutor() for managed_node2/TASK: Cleanup 30564 1726882846.59402: in run() - task 0e448fcc-3ce9-4216-acec-000000000a54 30564 1726882846.59420: variable 'ansible_search_path' from source: unknown 30564 1726882846.59427: variable 'ansible_search_path' from source: unknown 30564 1726882846.59479: variable 'lsr_cleanup' from source: include params 30564 1726882846.59678: variable 'lsr_cleanup' from source: include params 30564 1726882846.59753: variable 'omit' from source: magic vars 30564 1726882846.59905: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.59923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.59937: variable 'omit' from source: magic vars 30564 1726882846.60189: variable 'ansible_distribution_major_version' from source: facts 30564 1726882846.60203: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882846.60213: variable 'item' from source: unknown 30564 1726882846.60291: variable 'item' from source: unknown 30564 1726882846.60330: variable 'item' from source: unknown 30564 1726882846.60402: variable 'item' from source: unknown 30564 1726882846.60554: dumping result to json 30564 1726882846.60562: done dumping result, returning 30564 1726882846.60576: done running TaskExecutor() for managed_node2/TASK: Cleanup [0e448fcc-3ce9-4216-acec-000000000a54] 30564 1726882846.60586: sending task result for task 0e448fcc-3ce9-4216-acec-000000000a54 30564 1726882846.60672: no more pending results, returning what we have 30564 1726882846.60679: in VariableManager get_vars() 30564 1726882846.60716: Calling all_inventory to load vars for managed_node2 30564 1726882846.60719: Calling groups_inventory to load vars for managed_node2 30564 1726882846.60723: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882846.60737: Calling all_plugins_play to load vars for managed_node2 30564 1726882846.60741: 
Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882846.60744: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.61803: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000a54 30564 1726882846.61806: WORKER PROCESS EXITING 30564 1726882846.62635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.64477: done with get_vars() 30564 1726882846.64498: variable 'ansible_search_path' from source: unknown 30564 1726882846.64499: variable 'ansible_search_path' from source: unknown 30564 1726882846.64541: we have included files to process 30564 1726882846.64542: generating all_blocks data 30564 1726882846.64544: done generating all_blocks data 30564 1726882846.64548: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882846.64549: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882846.64551: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882846.64755: done processing included file 30564 1726882846.64757: iterating over new_blocks loaded from include file 30564 1726882846.64759: in VariableManager get_vars() 30564 1726882846.64777: done with get_vars() 30564 1726882846.64780: filtering new block on tags 30564 1726882846.64807: done filtering new block on tags 30564 1726882846.64809: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30564 1726882846.64814: extending task lists for all hosts with included blocks 30564 
1726882846.66192: done extending task lists 30564 1726882846.66194: done processing included files 30564 1726882846.66195: results queue empty 30564 1726882846.66195: checking for any_errors_fatal 30564 1726882846.66199: done checking for any_errors_fatal 30564 1726882846.66200: checking for max_fail_percentage 30564 1726882846.66201: done checking for max_fail_percentage 30564 1726882846.66202: checking to see if all hosts have failed and the running result is not ok 30564 1726882846.66203: done checking to see if all hosts have failed 30564 1726882846.66204: getting the remaining hosts for this loop 30564 1726882846.66205: done getting the remaining hosts for this loop 30564 1726882846.66208: getting the next task for host managed_node2 30564 1726882846.66212: done getting next task for host managed_node2 30564 1726882846.66214: ^ task is: TASK: Cleanup profile and device 30564 1726882846.66217: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882846.66220: getting variables 30564 1726882846.66221: in VariableManager get_vars() 30564 1726882846.66230: Calling all_inventory to load vars for managed_node2 30564 1726882846.66232: Calling groups_inventory to load vars for managed_node2 30564 1726882846.66235: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882846.66240: Calling all_plugins_play to load vars for managed_node2 30564 1726882846.66242: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882846.66245: Calling groups_plugins_play to load vars for managed_node2 30564 1726882846.67595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882846.69425: done with get_vars() 30564 1726882846.69447: done getting variables 30564 1726882846.69496: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:40:46 -0400 (0:00:00.109) 0:00:45.276 ****** 30564 1726882846.69530: entering _queue_task() for managed_node2/shell 30564 1726882846.69865: worker is 1 (out of 1 available) 30564 1726882846.69880: exiting _queue_task() for managed_node2/shell 30564 1726882846.69892: done queuing things up, now waiting for results queue to drain 30564 1726882846.69893: waiting for pending results... 
30564 1726882846.70205: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30564 1726882846.70325: in run() - task 0e448fcc-3ce9-4216-acec-000000000f6d 30564 1726882846.70347: variable 'ansible_search_path' from source: unknown 30564 1726882846.70354: variable 'ansible_search_path' from source: unknown 30564 1726882846.70401: calling self._execute() 30564 1726882846.70611: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.70674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.70689: variable 'omit' from source: magic vars 30564 1726882846.71502: variable 'ansible_distribution_major_version' from source: facts 30564 1726882846.71515: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882846.71521: variable 'omit' from source: magic vars 30564 1726882846.71570: variable 'omit' from source: magic vars 30564 1726882846.71829: variable 'interface' from source: play vars 30564 1726882846.71849: variable 'omit' from source: magic vars 30564 1726882846.71892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882846.71933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882846.71954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882846.71975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882846.71986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882846.72016: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882846.72024: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.72028: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.72472: Set connection var ansible_timeout to 10 30564 1726882846.72476: Set connection var ansible_pipelining to False 30564 1726882846.72478: Set connection var ansible_shell_type to sh 30564 1726882846.72479: Set connection var ansible_shell_executable to /bin/sh 30564 1726882846.72481: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882846.72483: Set connection var ansible_connection to ssh 30564 1726882846.72485: variable 'ansible_shell_executable' from source: unknown 30564 1726882846.72487: variable 'ansible_connection' from source: unknown 30564 1726882846.72489: variable 'ansible_module_compression' from source: unknown 30564 1726882846.72491: variable 'ansible_shell_type' from source: unknown 30564 1726882846.72493: variable 'ansible_shell_executable' from source: unknown 30564 1726882846.72495: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882846.72497: variable 'ansible_pipelining' from source: unknown 30564 1726882846.72500: variable 'ansible_timeout' from source: unknown 30564 1726882846.72502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882846.72505: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882846.72508: variable 'omit' from source: magic vars 30564 1726882846.72510: starting attempt loop 30564 1726882846.72512: running the handler 30564 1726882846.72514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882846.72516: _low_level_execute_command(): starting 30564 1726882846.72518: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882846.73148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882846.73160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.73175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.73189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.73229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.73238: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882846.73248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.73262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882846.73273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882846.73279: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882846.73288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.73297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.73309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.73317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.73323: stderr chunk (state=3): >>>debug2: match found <<< 30564 
1726882846.73335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.73412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882846.73432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882846.73446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882846.73585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882846.75239: stdout chunk (state=3): >>>/root <<< 30564 1726882846.75380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882846.75420: stderr chunk (state=3): >>><<< 30564 1726882846.75423: stdout chunk (state=3): >>><<< 30564 1726882846.75444: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 30564 1726882846.75457: _low_level_execute_command(): starting 30564 1726882846.75465: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785 `" && echo ansible-tmp-1726882846.7544358-32621-114721868135785="` echo /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785 `" ) && sleep 0' 30564 1726882846.77031: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882846.77109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.77119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.77133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.77172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.77218: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882846.77230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.77253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882846.77271: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882846.77274: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882846.77282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.77292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.77303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.77315: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.77328: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882846.77333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.77411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882846.77432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882846.77443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882846.77575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882846.79460: stdout chunk (state=3): >>>ansible-tmp-1726882846.7544358-32621-114721868135785=/root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785 <<< 30564 1726882846.79640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882846.79643: stdout chunk (state=3): >>><<< 30564 1726882846.79650: stderr chunk (state=3): >>><<< 30564 1726882846.79677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882846.7544358-32621-114721868135785=/root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882846.79709: variable 'ansible_module_compression' from source: unknown 30564 1726882846.79761: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882846.79799: variable 'ansible_facts' from source: unknown 30564 1726882846.79891: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785/AnsiballZ_command.py 30564 1726882846.80489: Sending initial data 30564 1726882846.80492: Sent initial data (156 bytes) 30564 1726882846.82811: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882846.82902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.82912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.82930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.82968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.82979: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882846.82989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.83003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882846.83042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882846.83047: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882846.83055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.83065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.83081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.83088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.83095: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882846.83104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.83212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882846.83287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882846.83299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882846.83427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882846.85237: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882846.85336: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882846.85439: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp9b4hmh7x /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785/AnsiballZ_command.py <<< 30564 1726882846.85536: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882846.87215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882846.87223: stderr chunk (state=3): >>><<< 30564 1726882846.87226: stdout chunk (state=3): >>><<< 30564 1726882846.87245: done transferring module to remote 30564 1726882846.87256: _low_level_execute_command(): starting 30564 1726882846.87260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785/ /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785/AnsiballZ_command.py && sleep 0' 30564 1726882846.88648: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882846.88982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.88992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.89007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.89043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.89049: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882846.89058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.89075: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882846.89084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882846.89090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 
1726882846.89097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.89106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.89116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.89122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882846.89128: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882846.89136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.89202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882846.89289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882846.89304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882846.89424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882846.91315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882846.91319: stdout chunk (state=3): >>><<< 30564 1726882846.91326: stderr chunk (state=3): >>><<< 30564 1726882846.91345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882846.91348: _low_level_execute_command(): starting 30564 1726882846.91353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785/AnsiballZ_command.py && sleep 0' 30564 1726882846.92900: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.93015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882846.93022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882846.93061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882846.93067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882846.93081: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882846.93087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882846.93091: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882846.93104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882846.93292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882846.93311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882846.93447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882847.13478: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (6d0eee33-2e09-457c-9193-5de1eabb8deb) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:40:47.062531", "end": "2024-09-20 21:40:47.132542", "delta": "0:00:00.070011", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882847.14754: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882847.14760: stderr chunk (state=3): >>><<< 30564 1726882847.14763: stdout chunk (state=3): >>><<< 30564 1726882847.14793: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (6d0eee33-2e09-457c-9193-5de1eabb8deb) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:40:47.062531", "end": "2024-09-20 21:40:47.132542", "delta": "0:00:00.070011", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 30564 1726882847.14832: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882847.14837: _low_level_execute_command(): starting 30564 1726882847.14842: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882846.7544358-32621-114721868135785/ > /dev/null 2>&1 && sleep 0' 30564 1726882847.15936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882847.16581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882847.16591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882847.16605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882847.16643: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882847.16651: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882847.16662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.16681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882847.16688: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882847.16696: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882847.16703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882847.16712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882847.16723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882847.16731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882847.16737: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882847.16746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.16824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882847.16841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882847.16852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882847.16987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882847.18877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882847.18880: stdout chunk (state=3): >>><<< 30564 1726882847.18887: stderr chunk (state=3): >>><<< 30564 1726882847.18902: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882847.18909: handler run complete 30564 1726882847.18932: Evaluated conditional (False): False 30564 1726882847.18942: attempt loop complete, returning result 30564 1726882847.18946: _execute() done 30564 1726882847.18948: dumping result to json 30564 1726882847.18951: done dumping result, returning 30564 1726882847.18961: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0e448fcc-3ce9-4216-acec-000000000f6d] 30564 1726882847.18968: sending task result for task 0e448fcc-3ce9-4216-acec-000000000f6d 30564 1726882847.19081: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000f6d 30564 1726882847.19084: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.070011", "end": "2024-09-20 21:40:47.132542", "rc": 1, "start": "2024-09-20 21:40:47.062531" } STDOUT: Connection 'statebr' (6d0eee33-2e09-457c-9193-5de1eabb8deb) successfully deleted. STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30564 1726882847.19143: no more pending results, returning what we have 30564 1726882847.19147: results queue empty 30564 1726882847.19148: checking for any_errors_fatal 30564 1726882847.19149: done checking for any_errors_fatal 30564 1726882847.19150: checking for max_fail_percentage 30564 1726882847.19151: done checking for max_fail_percentage 30564 1726882847.19152: checking to see if all hosts have failed and the running result is not ok 30564 1726882847.19153: done checking to see if all hosts have failed 30564 1726882847.19154: getting the remaining hosts for this loop 30564 1726882847.19156: done getting the remaining hosts for this loop 30564 1726882847.19160: getting the next task for host managed_node2 30564 1726882847.19172: done getting next task for host managed_node2 30564 1726882847.19175: ^ task is: TASK: Include the task 'run_test.yml' 30564 1726882847.19177: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882847.19181: getting variables 30564 1726882847.19182: in VariableManager get_vars() 30564 1726882847.19215: Calling all_inventory to load vars for managed_node2 30564 1726882847.19217: Calling groups_inventory to load vars for managed_node2 30564 1726882847.19221: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.19231: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.19234: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.19236: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.22077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.25072: done with get_vars() 30564 1726882847.25100: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:83 Friday 20 September 2024 21:40:47 -0400 (0:00:00.556) 0:00:45.833 ****** 30564 1726882847.25204: entering _queue_task() for managed_node2/include_tasks 30564 1726882847.25544: worker is 1 (out of 1 available) 30564 1726882847.25557: exiting _queue_task() for managed_node2/include_tasks 30564 1726882847.25572: done queuing things up, now waiting for results queue to drain 30564 1726882847.25574: waiting for pending results... 
30564 1726882847.25900: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30564 1726882847.26054: in run() - task 0e448fcc-3ce9-4216-acec-000000000013 30564 1726882847.26076: variable 'ansible_search_path' from source: unknown 30564 1726882847.26116: calling self._execute() 30564 1726882847.26273: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.26286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.26300: variable 'omit' from source: magic vars 30564 1726882847.26717: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.26735: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.26745: _execute() done 30564 1726882847.26753: dumping result to json 30564 1726882847.26760: done dumping result, returning 30564 1726882847.26773: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-4216-acec-000000000013] 30564 1726882847.26785: sending task result for task 0e448fcc-3ce9-4216-acec-000000000013 30564 1726882847.26936: no more pending results, returning what we have 30564 1726882847.26942: in VariableManager get_vars() 30564 1726882847.26989: Calling all_inventory to load vars for managed_node2 30564 1726882847.26992: Calling groups_inventory to load vars for managed_node2 30564 1726882847.26996: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.27010: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.27015: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.27020: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.28118: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000013 30564 1726882847.28121: WORKER PROCESS EXITING 30564 1726882847.29154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30564 1726882847.31023: done with get_vars() 30564 1726882847.31046: variable 'ansible_search_path' from source: unknown 30564 1726882847.31059: we have included files to process 30564 1726882847.31060: generating all_blocks data 30564 1726882847.31062: done generating all_blocks data 30564 1726882847.31070: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882847.31071: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882847.31074: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882847.31491: in VariableManager get_vars() 30564 1726882847.31508: done with get_vars() 30564 1726882847.31550: in VariableManager get_vars() 30564 1726882847.31569: done with get_vars() 30564 1726882847.31615: in VariableManager get_vars() 30564 1726882847.31631: done with get_vars() 30564 1726882847.31673: in VariableManager get_vars() 30564 1726882847.31695: done with get_vars() 30564 1726882847.31737: in VariableManager get_vars() 30564 1726882847.31753: done with get_vars() 30564 1726882847.32201: in VariableManager get_vars() 30564 1726882847.32910: done with get_vars() 30564 1726882847.32923: done processing included file 30564 1726882847.32925: iterating over new_blocks loaded from include file 30564 1726882847.32926: in VariableManager get_vars() 30564 1726882847.32938: done with get_vars() 30564 1726882847.32939: filtering new block on tags 30564 1726882847.33163: done filtering new block on tags 30564 1726882847.33167: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30564 1726882847.33173: extending task lists for all hosts with included 
blocks 30564 1726882847.33209: done extending task lists 30564 1726882847.33211: done processing included files 30564 1726882847.33211: results queue empty 30564 1726882847.33212: checking for any_errors_fatal 30564 1726882847.33331: done checking for any_errors_fatal 30564 1726882847.33333: checking for max_fail_percentage 30564 1726882847.33334: done checking for max_fail_percentage 30564 1726882847.33335: checking to see if all hosts have failed and the running result is not ok 30564 1726882847.33335: done checking to see if all hosts have failed 30564 1726882847.33336: getting the remaining hosts for this loop 30564 1726882847.33337: done getting the remaining hosts for this loop 30564 1726882847.33340: getting the next task for host managed_node2 30564 1726882847.33345: done getting next task for host managed_node2 30564 1726882847.33347: ^ task is: TASK: TEST: {{ lsr_description }} 30564 1726882847.33350: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882847.33352: getting variables 30564 1726882847.33353: in VariableManager get_vars() 30564 1726882847.33362: Calling all_inventory to load vars for managed_node2 30564 1726882847.33367: Calling groups_inventory to load vars for managed_node2 30564 1726882847.33369: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.33374: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.33377: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.33379: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.35749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.37612: done with get_vars() 30564 1726882847.37634: done getting variables 30564 1726882847.37687: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882847.37804: variable 'lsr_description' from source: include params

TASK [TEST: I can remove an existing profile without taking it down] ***********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5
Friday 20 September 2024  21:40:47 -0400 (0:00:00.126)       0:00:45.959 ******

30564 1726882847.37833: entering _queue_task() for managed_node2/debug 30564 1726882847.38407: worker is 1 (out of 1 available) 30564 1726882847.38420: exiting _queue_task() for managed_node2/debug 30564 1726882847.38546: done queuing things up, now waiting for results queue to drain 30564 1726882847.38548: waiting for pending results... 
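The `TASK [TEST: ...]` banner above is produced by a `debug` task at line 5 of run_test.yml, templated from the `lsr_description` include parameter. The file itself is not reproduced in this log, so the YAML below is a hedged reconstruction inferred only from the task name and the `##########`-framed message printed further down; it is not the actual file contents.

```yaml
# Hypothetical sketch of run_test.yml:5, reconstructed from the log output
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: |
      ##########
      {{ lsr_description }}
      ##########
```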
30564 1726882847.39248: running TaskExecutor() for managed_node2/TASK: TEST: I can remove an existing profile without taking it down 30564 1726882847.39349: in run() - task 0e448fcc-3ce9-4216-acec-000000001005 30564 1726882847.39367: variable 'ansible_search_path' from source: unknown 30564 1726882847.39378: variable 'ansible_search_path' from source: unknown 30564 1726882847.39419: calling self._execute() 30564 1726882847.39517: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.39521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.39532: variable 'omit' from source: magic vars 30564 1726882847.39909: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.39922: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.39929: variable 'omit' from source: magic vars 30564 1726882847.39976: variable 'omit' from source: magic vars 30564 1726882847.40079: variable 'lsr_description' from source: include params 30564 1726882847.40097: variable 'omit' from source: magic vars 30564 1726882847.40137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882847.40181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.40201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882847.40219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.40231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.40275: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.40280: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882847.40282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.40388: Set connection var ansible_timeout to 10 30564 1726882847.40393: Set connection var ansible_pipelining to False 30564 1726882847.40396: Set connection var ansible_shell_type to sh 30564 1726882847.40401: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.40409: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.40412: Set connection var ansible_connection to ssh 30564 1726882847.40437: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.40441: variable 'ansible_connection' from source: unknown 30564 1726882847.40444: variable 'ansible_module_compression' from source: unknown 30564 1726882847.40447: variable 'ansible_shell_type' from source: unknown 30564 1726882847.40449: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.40451: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.40453: variable 'ansible_pipelining' from source: unknown 30564 1726882847.40456: variable 'ansible_timeout' from source: unknown 30564 1726882847.40460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.40604: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.40617: variable 'omit' from source: magic vars 30564 1726882847.40620: starting attempt loop 30564 1726882847.40623: running the handler 30564 1726882847.40668: handler run complete 30564 1726882847.40686: attempt loop complete, returning result 30564 1726882847.40689: _execute() done 30564 1726882847.40691: dumping result to json 30564 1726882847.40694: done dumping result, returning 
30564 1726882847.40705: done running TaskExecutor() for managed_node2/TASK: TEST: I can remove an existing profile without taking it down [0e448fcc-3ce9-4216-acec-000000001005] 30564 1726882847.40710: sending task result for task 0e448fcc-3ce9-4216-acec-000000001005 30564 1726882847.40800: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001005 30564 1726882847.40803: WORKER PROCESS EXITING

ok: [managed_node2] => {}

MSG:

##########
I can remove an existing profile without taking it down
##########

30564 1726882847.40875: no more pending results, returning what we have 30564 1726882847.40880: results queue empty 30564 1726882847.40881: checking for any_errors_fatal 30564 1726882847.40882: done checking for any_errors_fatal 30564 1726882847.40883: checking for max_fail_percentage 30564 1726882847.40885: done checking for max_fail_percentage 30564 1726882847.40885: checking to see if all hosts have failed and the running result is not ok 30564 1726882847.40886: done checking to see if all hosts have failed 30564 1726882847.40887: getting the remaining hosts for this loop 30564 1726882847.40889: done getting the remaining hosts for this loop 30564 1726882847.40893: getting the next task for host managed_node2 30564 1726882847.40901: done getting next task for host managed_node2 30564 1726882847.40904: ^ task is: TASK: Show item 30564 1726882847.40906: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882847.40911: getting variables 30564 1726882847.40913: in VariableManager get_vars() 30564 1726882847.40946: Calling all_inventory to load vars for managed_node2 30564 1726882847.40948: Calling groups_inventory to load vars for managed_node2 30564 1726882847.40951: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.40961: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.40966: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.40968: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.42428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.44119: done with get_vars() 30564 1726882847.44142: done getting variables 30564 1726882847.44199: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024  21:40:47 -0400 (0:00:00.063)       0:00:46.023 ******

30564 1726882847.44229: entering _queue_task() for managed_node2/debug 30564 1726882847.44511: worker is 1 (out of 1 available) 30564 1726882847.44525: exiting _queue_task() for managed_node2/debug 30564 1726882847.44537: done queuing things up, now waiting for results queue to drain 30564 1726882847.44538: waiting for pending results... 
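The "Show item" task at line 9 of run_test.yml loops over the `lsr_*` parameter names and prints each one (the per-item `ok:` results follow below, with `ansible_loop_var: item`). The exact file contents are not shown in this log, so the following YAML is only a hedged sketch reconstructed from the loop items and output shape seen here:

```yaml
# Hypothetical sketch of run_test.yml:9, reconstructed from the log output
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
```

With `debug: var:`, an undefined variable is reported as "VARIABLE IS NOT DEFINED!" rather than failing the task, which matches the `lsr_assert_when` result below.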
30564 1726882847.44829: running TaskExecutor() for managed_node2/TASK: Show item 30564 1726882847.44938: in run() - task 0e448fcc-3ce9-4216-acec-000000001006 30564 1726882847.44960: variable 'ansible_search_path' from source: unknown 30564 1726882847.44976: variable 'ansible_search_path' from source: unknown 30564 1726882847.45033: variable 'omit' from source: magic vars 30564 1726882847.45184: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.45203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.45219: variable 'omit' from source: magic vars 30564 1726882847.45580: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.45599: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.45609: variable 'omit' from source: magic vars 30564 1726882847.45657: variable 'omit' from source: magic vars 30564 1726882847.45703: variable 'item' from source: unknown 30564 1726882847.45779: variable 'item' from source: unknown 30564 1726882847.45801: variable 'omit' from source: magic vars 30564 1726882847.45853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882847.45896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.45922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882847.45944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.45967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.45999: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.46008: variable 'ansible_host' from source: host vars for 'managed_node2' 
30564 1726882847.46014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.46123: Set connection var ansible_timeout to 10 30564 1726882847.46134: Set connection var ansible_pipelining to False 30564 1726882847.46141: Set connection var ansible_shell_type to sh 30564 1726882847.46151: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.46163: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.46171: Set connection var ansible_connection to ssh 30564 1726882847.46201: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.46210: variable 'ansible_connection' from source: unknown 30564 1726882847.46216: variable 'ansible_module_compression' from source: unknown 30564 1726882847.46223: variable 'ansible_shell_type' from source: unknown 30564 1726882847.46229: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.46234: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.46242: variable 'ansible_pipelining' from source: unknown 30564 1726882847.46248: variable 'ansible_timeout' from source: unknown 30564 1726882847.46256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.46402: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.46417: variable 'omit' from source: magic vars 30564 1726882847.46427: starting attempt loop 30564 1726882847.46434: running the handler 30564 1726882847.46488: variable 'lsr_description' from source: include params 30564 1726882847.46556: variable 'lsr_description' from source: include params 30564 1726882847.46574: handler run complete 30564 1726882847.46598: attempt loop 
complete, returning result 30564 1726882847.46623: variable 'item' from source: unknown 30564 1726882847.46690: variable 'item' from source: unknown

ok: [managed_node2] => (item=lsr_description) => {
    "ansible_loop_var": "item",
    "item": "lsr_description",
    "lsr_description": "I can remove an existing profile without taking it down"
}

30564 1726882847.46939: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.46955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.46974: variable 'omit' from source: magic vars 30564 1726882847.47137: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.47148: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.47157: variable 'omit' from source: magic vars 30564 1726882847.47179: variable 'omit' from source: magic vars 30564 1726882847.47228: variable 'item' from source: unknown 30564 1726882847.47292: variable 'item' from source: unknown 30564 1726882847.47314: variable 'omit' from source: magic vars 30564 1726882847.47338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.47351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.47363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.47382: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.47391: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.47398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.47479: Set connection var ansible_timeout to 10 30564 1726882847.47489: Set connection var ansible_pipelining to 
False 30564 1726882847.47496: Set connection var ansible_shell_type to sh 30564 1726882847.47504: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.47515: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.47520: Set connection var ansible_connection to ssh 30564 1726882847.47548: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.47556: variable 'ansible_connection' from source: unknown 30564 1726882847.47566: variable 'ansible_module_compression' from source: unknown 30564 1726882847.47574: variable 'ansible_shell_type' from source: unknown 30564 1726882847.47581: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.47588: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.47595: variable 'ansible_pipelining' from source: unknown 30564 1726882847.47601: variable 'ansible_timeout' from source: unknown 30564 1726882847.47608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.47701: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.47715: variable 'omit' from source: magic vars 30564 1726882847.47723: starting attempt loop 30564 1726882847.47729: running the handler 30564 1726882847.47758: variable 'lsr_setup' from source: include params 30564 1726882847.47827: variable 'lsr_setup' from source: include params 30564 1726882847.47882: handler run complete 30564 1726882847.47902: attempt loop complete, returning result 30564 1726882847.47921: variable 'item' from source: unknown 30564 1726882847.47994: variable 'item' from source: unknown

ok: [managed_node2] => (item=lsr_setup) => {
    "ansible_loop_var": "item",
    "item": "lsr_setup",
    "lsr_setup": [
        "tasks/create_bridge_profile.yml",
        "tasks/activate_profile.yml"
    ]
}

30564 1726882847.48141: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.48154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.48169: variable 'omit' from source: magic vars 30564 1726882847.48328: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.48338: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.48346: variable 'omit' from source: magic vars 30564 1726882847.48362: variable 'omit' from source: magic vars 30564 1726882847.48406: variable 'item' from source: unknown 30564 1726882847.48474: variable 'item' from source: unknown 30564 1726882847.48493: variable 'omit' from source: magic vars 30564 1726882847.48513: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.48530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.48541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.48555: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.48565: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.48573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.48651: Set connection var ansible_timeout to 10 30564 1726882847.48660: Set connection var ansible_pipelining to False 30564 1726882847.48670: Set connection var ansible_shell_type to sh 30564 1726882847.48682: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.48695: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 
1726882847.48702: Set connection var ansible_connection to ssh 30564 1726882847.48727: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.48741: variable 'ansible_connection' from source: unknown 30564 1726882847.48749: variable 'ansible_module_compression' from source: unknown 30564 1726882847.48755: variable 'ansible_shell_type' from source: unknown 30564 1726882847.48762: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.48770: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.48778: variable 'ansible_pipelining' from source: unknown 30564 1726882847.48784: variable 'ansible_timeout' from source: unknown 30564 1726882847.48792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.48883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.48897: variable 'omit' from source: magic vars 30564 1726882847.48905: starting attempt loop 30564 1726882847.48912: running the handler 30564 1726882847.48934: variable 'lsr_test' from source: include params 30564 1726882847.49002: variable 'lsr_test' from source: include params 30564 1726882847.49023: handler run complete 30564 1726882847.49040: attempt loop complete, returning result 30564 1726882847.49057: variable 'item' from source: unknown 30564 1726882847.49124: variable 'item' from source: unknown

ok: [managed_node2] => (item=lsr_test) => {
    "ansible_loop_var": "item",
    "item": "lsr_test",
    "lsr_test": [
        "tasks/remove_profile.yml"
    ]
}

30564 1726882847.49259: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.49276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882847.49291: variable 'omit' from source: magic vars 30564 1726882847.49442: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.49452: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.49461: variable 'omit' from source: magic vars 30564 1726882847.49482: variable 'omit' from source: magic vars 30564 1726882847.49524: variable 'item' from source: unknown 30564 1726882847.49591: variable 'item' from source: unknown 30564 1726882847.49609: variable 'omit' from source: magic vars 30564 1726882847.49630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.49646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.49655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.49671: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.49678: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.49684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.49757: Set connection var ansible_timeout to 10 30564 1726882847.49769: Set connection var ansible_pipelining to False 30564 1726882847.49776: Set connection var ansible_shell_type to sh 30564 1726882847.49786: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.49797: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.49803: Set connection var ansible_connection to ssh 30564 1726882847.49827: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.49834: variable 'ansible_connection' from source: unknown 30564 1726882847.49840: variable 'ansible_module_compression' from 
source: unknown 30564 1726882847.49846: variable 'ansible_shell_type' from source: unknown 30564 1726882847.49857: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.49866: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.49875: variable 'ansible_pipelining' from source: unknown 30564 1726882847.49882: variable 'ansible_timeout' from source: unknown 30564 1726882847.49890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.49979: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.49992: variable 'omit' from source: magic vars 30564 1726882847.50001: starting attempt loop 30564 1726882847.50007: running the handler 30564 1726882847.50028: variable 'lsr_assert' from source: include params 30564 1726882847.50100: variable 'lsr_assert' from source: include params 30564 1726882847.50122: handler run complete 30564 1726882847.50140: attempt loop complete, returning result 30564 1726882847.50159: variable 'item' from source: unknown 30564 1726882847.50224: variable 'item' from source: unknown

ok: [managed_node2] => (item=lsr_assert) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert",
    "lsr_assert": [
        "tasks/assert_device_present.yml",
        "tasks/assert_profile_absent.yml"
    ]
}

30564 1726882847.50366: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.50380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.50394: variable 'omit' from source: magic vars 30564 1726882847.50579: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.50594: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30564 1726882847.50602: variable 'omit' from source: magic vars 30564 1726882847.50619: variable 'omit' from source: magic vars 30564 1726882847.50667: variable 'item' from source: unknown 30564 1726882847.50732: variable 'item' from source: unknown 30564 1726882847.50755: variable 'omit' from source: magic vars 30564 1726882847.50780: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.50793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.50803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.50818: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.50825: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.50832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.50909: Set connection var ansible_timeout to 10 30564 1726882847.50920: Set connection var ansible_pipelining to False 30564 1726882847.50927: Set connection var ansible_shell_type to sh 30564 1726882847.50938: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.50950: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.50959: Set connection var ansible_connection to ssh 30564 1726882847.50987: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.50996: variable 'ansible_connection' from source: unknown 30564 1726882847.51003: variable 'ansible_module_compression' from source: unknown 30564 1726882847.51011: variable 'ansible_shell_type' from source: unknown 30564 1726882847.51018: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.51024: variable 'ansible_host' from source: host vars 
for 'managed_node2' 30564 1726882847.51032: variable 'ansible_pipelining' from source: unknown 30564 1726882847.51038: variable 'ansible_timeout' from source: unknown 30564 1726882847.51045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.51133: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.51146: variable 'omit' from source: magic vars 30564 1726882847.51154: starting attempt loop 30564 1726882847.51160: running the handler 30564 1726882847.51262: handler run complete 30564 1726882847.51282: attempt loop complete, returning result 30564 1726882847.51305: variable 'item' from source: unknown 30564 1726882847.51365: variable 'item' from source: unknown

ok: [managed_node2] => (item=lsr_assert_when) => {
    "ansible_loop_var": "item",
    "item": "lsr_assert_when",
    "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined"
}

30564 1726882847.51501: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.51514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.51526: variable 'omit' from source: magic vars 30564 1726882847.51898: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.51908: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.51916: variable 'omit' from source: magic vars 30564 1726882847.51932: variable 'omit' from source: magic vars 30564 1726882847.51978: variable 'item' from source: unknown 30564 1726882847.52042: variable 'item' from source: unknown 30564 1726882847.52113: variable 'omit' from source: magic vars 30564 1726882847.52134: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.52216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.52227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.52241: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.52248: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.52254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.52443: Set connection var ansible_timeout to 10 30564 1726882847.52452: Set connection var ansible_pipelining to False 30564 1726882847.52458: Set connection var ansible_shell_type to sh 30564 1726882847.52469: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.52480: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.52485: Set connection var ansible_connection to ssh 30564 1726882847.52507: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.52513: variable 'ansible_connection' from source: unknown 30564 1726882847.52537: variable 'ansible_module_compression' from source: unknown 30564 1726882847.52544: variable 'ansible_shell_type' from source: unknown 30564 1726882847.52550: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.52647: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.52656: variable 'ansible_pipelining' from source: unknown 30564 1726882847.52662: variable 'ansible_timeout' from source: unknown 30564 1726882847.52672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.52874: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.52887: variable 'omit' from source: magic vars 30564 1726882847.52896: starting attempt loop 30564 1726882847.52903: running the handler 30564 1726882847.52924: variable 'lsr_fail_debug' from source: play vars 30564 1726882847.52994: variable 'lsr_fail_debug' from source: play vars 30564 1726882847.53089: handler run complete 30564 1726882847.53107: attempt loop complete, returning result 30564 1726882847.53125: variable 'item' from source: unknown 30564 1726882847.53302: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30564 1726882847.53443: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.53518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.53532: variable 'omit' from source: magic vars 30564 1726882847.53791: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.53840: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.53849: variable 'omit' from source: magic vars 30564 1726882847.53956: variable 'omit' from source: magic vars 30564 1726882847.54000: variable 'item' from source: unknown 30564 1726882847.54177: variable 'item' from source: unknown 30564 1726882847.54197: variable 'omit' from source: magic vars 30564 1726882847.54218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.54229: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.54243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.54257: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.54269: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.54281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.54352: Set connection var ansible_timeout to 10 30564 1726882847.54489: Set connection var ansible_pipelining to False 30564 1726882847.54497: Set connection var ansible_shell_type to sh 30564 1726882847.54506: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.54517: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.54523: Set connection var ansible_connection to ssh 30564 1726882847.54547: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.54553: variable 'ansible_connection' from source: unknown 30564 1726882847.54558: variable 'ansible_module_compression' from source: unknown 30564 1726882847.54566: variable 'ansible_shell_type' from source: unknown 30564 1726882847.54593: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.54600: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.54607: variable 'ansible_pipelining' from source: unknown 30564 1726882847.54613: variable 'ansible_timeout' from source: unknown 30564 1726882847.54704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.54782: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.54815: variable 'omit' from source: magic vars 30564 1726882847.54824: starting attempt loop 30564 1726882847.54920: running the handler 30564 1726882847.54942: variable 'lsr_cleanup' from source: include params 30564 1726882847.55000: variable 'lsr_cleanup' from source: include params 30564 1726882847.55020: handler run complete 30564 1726882847.55149: attempt loop complete, returning result 30564 1726882847.55170: variable 'item' from source: unknown 30564 1726882847.55230: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30564 1726882847.55366: dumping result to json 30564 1726882847.55382: done dumping result, returning 30564 1726882847.55394: done running TaskExecutor() for managed_node2/TASK: Show item [0e448fcc-3ce9-4216-acec-000000001006] 30564 1726882847.55470: sending task result for task 0e448fcc-3ce9-4216-acec-000000001006 30564 1726882847.55553: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001006 30564 1726882847.55562: WORKER PROCESS EXITING 30564 1726882847.55614: no more pending results, returning what we have 30564 1726882847.55618: results queue empty 30564 1726882847.55619: checking for any_errors_fatal 30564 1726882847.55627: done checking for any_errors_fatal 30564 1726882847.55628: checking for max_fail_percentage 30564 1726882847.55629: done checking for max_fail_percentage 30564 1726882847.55630: checking to see if all hosts have failed and the running result is not ok 30564 1726882847.55631: done checking to see if all hosts have failed 30564 1726882847.55632: getting the remaining hosts for this loop 30564 1726882847.55634: done getting the remaining hosts for this loop 30564 
1726882847.55637: getting the next task for host managed_node2 30564 1726882847.55644: done getting next task for host managed_node2 30564 1726882847.55647: ^ task is: TASK: Include the task 'show_interfaces.yml' 30564 1726882847.55649: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882847.55653: getting variables 30564 1726882847.55655: in VariableManager get_vars() 30564 1726882847.55689: Calling all_inventory to load vars for managed_node2 30564 1726882847.55692: Calling groups_inventory to load vars for managed_node2 30564 1726882847.55695: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.55706: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.55709: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.55712: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.58885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.60981: done with get_vars() 30564 1726882847.61004: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:40:47 -0400 (0:00:00.168) 0:00:46.192 ****** 30564 1726882847.61104: entering _queue_task() for managed_node2/include_tasks 30564 
1726882847.61409: worker is 1 (out of 1 available) 30564 1726882847.61421: exiting _queue_task() for managed_node2/include_tasks 30564 1726882847.61433: done queuing things up, now waiting for results queue to drain 30564 1726882847.61434: waiting for pending results... 30564 1726882847.61718: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30564 1726882847.61828: in run() - task 0e448fcc-3ce9-4216-acec-000000001007 30564 1726882847.61848: variable 'ansible_search_path' from source: unknown 30564 1726882847.61855: variable 'ansible_search_path' from source: unknown 30564 1726882847.61898: calling self._execute() 30564 1726882847.61995: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.62006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.62020: variable 'omit' from source: magic vars 30564 1726882847.62387: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.62405: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.62420: _execute() done 30564 1726882847.62428: dumping result to json 30564 1726882847.62435: done dumping result, returning 30564 1726882847.62446: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-4216-acec-000000001007] 30564 1726882847.62456: sending task result for task 0e448fcc-3ce9-4216-acec-000000001007 30564 1726882847.62581: no more pending results, returning what we have 30564 1726882847.62587: in VariableManager get_vars() 30564 1726882847.62625: Calling all_inventory to load vars for managed_node2 30564 1726882847.62629: Calling groups_inventory to load vars for managed_node2 30564 1726882847.62632: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.62645: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.62649: Calling groups_plugins_inventory to load 
vars for managed_node2 30564 1726882847.62652: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.63819: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001007 30564 1726882847.63822: WORKER PROCESS EXITING 30564 1726882847.64469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.66338: done with get_vars() 30564 1726882847.66356: variable 'ansible_search_path' from source: unknown 30564 1726882847.66358: variable 'ansible_search_path' from source: unknown 30564 1726882847.66402: we have included files to process 30564 1726882847.66403: generating all_blocks data 30564 1726882847.66405: done generating all_blocks data 30564 1726882847.66409: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882847.66411: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882847.66413: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882847.66526: in VariableManager get_vars() 30564 1726882847.66546: done with get_vars() 30564 1726882847.66655: done processing included file 30564 1726882847.66657: iterating over new_blocks loaded from include file 30564 1726882847.66658: in VariableManager get_vars() 30564 1726882847.66673: done with get_vars() 30564 1726882847.66675: filtering new block on tags 30564 1726882847.66714: done filtering new block on tags 30564 1726882847.66716: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30564 1726882847.66721: extending task lists for all hosts with included blocks 30564 1726882847.67203: 
done extending task lists 30564 1726882847.67204: done processing included files 30564 1726882847.67205: results queue empty 30564 1726882847.67206: checking for any_errors_fatal 30564 1726882847.67211: done checking for any_errors_fatal 30564 1726882847.67212: checking for max_fail_percentage 30564 1726882847.67213: done checking for max_fail_percentage 30564 1726882847.67214: checking to see if all hosts have failed and the running result is not ok 30564 1726882847.67214: done checking to see if all hosts have failed 30564 1726882847.67215: getting the remaining hosts for this loop 30564 1726882847.67216: done getting the remaining hosts for this loop 30564 1726882847.67219: getting the next task for host managed_node2 30564 1726882847.67223: done getting next task for host managed_node2 30564 1726882847.67225: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30564 1726882847.67228: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882847.67231: getting variables 30564 1726882847.67231: in VariableManager get_vars() 30564 1726882847.67245: Calling all_inventory to load vars for managed_node2 30564 1726882847.67248: Calling groups_inventory to load vars for managed_node2 30564 1726882847.67250: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.67255: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.67257: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.67260: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.73452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.74681: done with get_vars() 30564 1726882847.74698: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:40:47 -0400 (0:00:00.136) 0:00:46.328 ****** 30564 1726882847.74750: entering _queue_task() for managed_node2/include_tasks 30564 1726882847.74990: worker is 1 (out of 1 available) 30564 1726882847.75005: exiting _queue_task() for managed_node2/include_tasks 30564 1726882847.75017: done queuing things up, now waiting for results queue to drain 30564 1726882847.75019: waiting for pending results... 
30564 1726882847.75197: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30564 1726882847.75279: in run() - task 0e448fcc-3ce9-4216-acec-00000000102e 30564 1726882847.75289: variable 'ansible_search_path' from source: unknown 30564 1726882847.75294: variable 'ansible_search_path' from source: unknown 30564 1726882847.75325: calling self._execute() 30564 1726882847.75401: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.75406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.75414: variable 'omit' from source: magic vars 30564 1726882847.75700: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.75711: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.75717: _execute() done 30564 1726882847.75720: dumping result to json 30564 1726882847.75723: done dumping result, returning 30564 1726882847.75729: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-4216-acec-00000000102e] 30564 1726882847.75735: sending task result for task 0e448fcc-3ce9-4216-acec-00000000102e 30564 1726882847.75827: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000102e 30564 1726882847.75829: WORKER PROCESS EXITING 30564 1726882847.75858: no more pending results, returning what we have 30564 1726882847.75865: in VariableManager get_vars() 30564 1726882847.76073: Calling all_inventory to load vars for managed_node2 30564 1726882847.76076: Calling groups_inventory to load vars for managed_node2 30564 1726882847.76079: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.76089: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.76092: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.76096: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882847.77442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.78376: done with get_vars() 30564 1726882847.78391: variable 'ansible_search_path' from source: unknown 30564 1726882847.78392: variable 'ansible_search_path' from source: unknown 30564 1726882847.78416: we have included files to process 30564 1726882847.78417: generating all_blocks data 30564 1726882847.78418: done generating all_blocks data 30564 1726882847.78419: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882847.78420: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882847.78421: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882847.78595: done processing included file 30564 1726882847.78596: iterating over new_blocks loaded from include file 30564 1726882847.78598: in VariableManager get_vars() 30564 1726882847.78609: done with get_vars() 30564 1726882847.78611: filtering new block on tags 30564 1726882847.78635: done filtering new block on tags 30564 1726882847.78636: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30564 1726882847.78640: extending task lists for all hosts with included blocks 30564 1726882847.78740: done extending task lists 30564 1726882847.78741: done processing included files 30564 1726882847.78742: results queue empty 30564 1726882847.78742: checking for any_errors_fatal 30564 1726882847.78744: done checking for any_errors_fatal 30564 1726882847.78745: checking for max_fail_percentage 30564 1726882847.78746: done 
checking for max_fail_percentage 30564 1726882847.78746: checking to see if all hosts have failed and the running result is not ok 30564 1726882847.78747: done checking to see if all hosts have failed 30564 1726882847.78747: getting the remaining hosts for this loop 30564 1726882847.78748: done getting the remaining hosts for this loop 30564 1726882847.78750: getting the next task for host managed_node2 30564 1726882847.78753: done getting next task for host managed_node2 30564 1726882847.78754: ^ task is: TASK: Gather current interface info 30564 1726882847.78757: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882847.78758: getting variables 30564 1726882847.78759: in VariableManager get_vars() 30564 1726882847.78767: Calling all_inventory to load vars for managed_node2 30564 1726882847.78769: Calling groups_inventory to load vars for managed_node2 30564 1726882847.78770: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882847.78774: Calling all_plugins_play to load vars for managed_node2 30564 1726882847.78775: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882847.78777: Calling groups_plugins_play to load vars for managed_node2 30564 1726882847.79449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882847.80362: done with get_vars() 30564 1726882847.80379: done getting variables 30564 1726882847.80404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:40:47 -0400 (0:00:00.056) 0:00:46.385 ****** 30564 1726882847.80424: entering _queue_task() for managed_node2/command 30564 1726882847.80625: worker is 1 (out of 1 available) 30564 1726882847.80640: exiting _queue_task() for managed_node2/command 30564 1726882847.80652: done queuing things up, now waiting for results queue to drain 30564 1726882847.80654: waiting for pending results... 
30564 1726882847.80837: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30564 1726882847.80927: in run() - task 0e448fcc-3ce9-4216-acec-000000001069 30564 1726882847.80937: variable 'ansible_search_path' from source: unknown 30564 1726882847.80940: variable 'ansible_search_path' from source: unknown 30564 1726882847.80970: calling self._execute() 30564 1726882847.81045: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.81052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.81062: variable 'omit' from source: magic vars 30564 1726882847.81343: variable 'ansible_distribution_major_version' from source: facts 30564 1726882847.81353: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882847.81359: variable 'omit' from source: magic vars 30564 1726882847.81398: variable 'omit' from source: magic vars 30564 1726882847.81420: variable 'omit' from source: magic vars 30564 1726882847.81453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882847.81485: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882847.81502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882847.81514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.81524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882847.81549: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882847.81552: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.81555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882847.81624: Set connection var ansible_timeout to 10 30564 1726882847.81627: Set connection var ansible_pipelining to False 30564 1726882847.81630: Set connection var ansible_shell_type to sh 30564 1726882847.81635: Set connection var ansible_shell_executable to /bin/sh 30564 1726882847.81643: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882847.81646: Set connection var ansible_connection to ssh 30564 1726882847.81665: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.81668: variable 'ansible_connection' from source: unknown 30564 1726882847.81671: variable 'ansible_module_compression' from source: unknown 30564 1726882847.81674: variable 'ansible_shell_type' from source: unknown 30564 1726882847.81676: variable 'ansible_shell_executable' from source: unknown 30564 1726882847.81680: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882847.81684: variable 'ansible_pipelining' from source: unknown 30564 1726882847.81686: variable 'ansible_timeout' from source: unknown 30564 1726882847.81690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882847.81790: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882847.81798: variable 'omit' from source: magic vars 30564 1726882847.81803: starting attempt loop 30564 1726882847.81806: running the handler 30564 1726882847.81818: _low_level_execute_command(): starting 30564 1726882847.81825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882847.82337: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882847.82347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882847.82374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882847.82388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882847.82400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.82453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882847.82459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882847.82585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882847.84258: stdout chunk (state=3): >>>/root <<< 30564 1726882847.84368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882847.84415: stderr chunk (state=3): >>><<< 30564 1726882847.84418: stdout chunk (state=3): >>><<< 30564 1726882847.84437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882847.84450: _low_level_execute_command(): starting 30564 1726882847.84456: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728 `" && echo ansible-tmp-1726882847.8443592-32664-127220571768728="` echo /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728 `" ) && sleep 0' 30564 1726882847.84876: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882847.84894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882847.84918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.84938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.84987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882847.85007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882847.85115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882847.87044: stdout chunk (state=3): >>>ansible-tmp-1726882847.8443592-32664-127220571768728=/root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728 <<< 30564 1726882847.87153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882847.87198: stderr chunk (state=3): >>><<< 30564 1726882847.87202: stdout chunk (state=3): >>><<< 30564 1726882847.87216: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882847.8443592-32664-127220571768728=/root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882847.87243: variable 'ansible_module_compression' from source: unknown 30564 1726882847.87286: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882847.87316: variable 'ansible_facts' from source: unknown 30564 1726882847.87382: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728/AnsiballZ_command.py 30564 1726882847.87487: Sending initial data 30564 1726882847.87491: Sent initial data (156 bytes) 30564 1726882847.88123: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882847.88135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882847.88159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882847.88173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882847.88184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.88230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882847.88243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882847.88352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882847.90177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882847.90269: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882847.90367: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp0_n2qk85 /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728/AnsiballZ_command.py <<< 30564 1726882847.90461: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882847.91482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882847.91568: stderr chunk (state=3): >>><<< 30564 1726882847.91573: stdout chunk (state=3): >>><<< 30564 1726882847.91594: 
done transferring module to remote 30564 1726882847.91601: _low_level_execute_command(): starting 30564 1726882847.91606: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728/ /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728/AnsiballZ_command.py && sleep 0' 30564 1726882847.92007: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882847.92020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882847.92046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.92058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.92111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882847.92122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882847.92226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882847.94004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882847.94049: stderr chunk (state=3): >>><<< 
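The preceding records show the standard staging sequence for a module run: a restrictive-permission remote temp dir is created with `umask 77 && mkdir -p ...`, the AnsiballZ payload is transferred over SFTP, and `chmod u+x` marks it executable. A minimal local sketch of that sequence (illustrative only, not Ansible's actual code; the base path and naming details here are assumptions, and the real commands run remotely via `/bin/sh`):

```python
import os
import time

def make_tmpdir(base="/tmp/demo-ansible"):
    """Create an owner-only staging dir named like ansible-tmp-<ts>-<pid>."""
    old_umask = os.umask(0o77)  # mirrors `umask 77`: dirs come out mode 0700
    try:
        path = os.path.join(base, f"ansible-tmp-{time.time()}-{os.getpid()}")
        os.makedirs(path)       # mirrors `mkdir -p "<base>"` + `mkdir "<dir>"`
        return path
    finally:
        os.umask(old_umask)     # restore the process umask

tmp = make_tmpdir()
module_path = os.path.join(tmp, "AnsiballZ_command.py")
open(module_path, "w").close()  # stand-in for the SFTP `put` in the log
# mirrors the `chmod u+x` step: add the owner-execute bit
os.chmod(module_path, os.stat(module_path).st_mode | 0o100)
```

The `umask 77` is what makes the staging directory private (mode 0700) regardless of the shell's default; the log's "Couldn't stat remote file: No such file or directory" during the SFTP put is benign, since it is just the client checking whether the destination already exists.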
30564 1726882847.94062: stdout chunk (state=3): >>><<< 30564 1726882847.94073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882847.94076: _low_level_execute_command(): starting 30564 1726882847.94079: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728/AnsiballZ_command.py && sleep 0' 30564 1726882847.94508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882847.94514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882847.94539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.94551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882847.94605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882847.94620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882847.94735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882848.08207: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:48.076710", "end": "2024-09-20 21:40:48.079955", "delta": "0:00:00.003245", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882848.09501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882848.09505: stdout chunk (state=3): >>><<< 30564 1726882848.09507: stderr chunk (state=3): >>><<< 30564 1726882848.09575: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:40:48.076710", "end": "2024-09-20 21:40:48.079955", "delta": "0:00:00.003245", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
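As the stdout chunk above shows, the module's entire result comes back as a single JSON document on stdout, which the controller parses into the task result. A minimal sketch of that parse, reusing the field values from the log (the parsing code itself is illustrative, not Ansible's implementation):

```python
import json

# Trimmed copy of the module stdout captured in the log above.
module_stdout = (
    '{"changed": true, '
    '"stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr", '
    '"stderr": "", "rc": 0, "cmd": ["ls", "-1"], '
    '"delta": "0:00:00.003245"}'
)

result = json.loads(module_stdout)
# The command module returns stdout as one string; splitting it recovers
# the per-line `ls -1` output.
interfaces = result["stdout"].splitlines()
print(interfaces)  # ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
```

This is why the SSH stderr noise does not corrupt results: stdout and stderr are captured separately, and only stdout is expected to contain the JSON payload.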
30564 1726882848.09678: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882848.09682: _low_level_execute_command(): starting 30564 1726882848.09685: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882847.8443592-32664-127220571768728/ > /dev/null 2>&1 && sleep 0' 30564 1726882848.10152: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882848.10158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.10193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.10201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882848.10206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882848.10212: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.10225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.10232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882848.10237: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882848.10242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.10302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882848.10309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882848.10426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882848.12244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882848.12292: stderr chunk (state=3): >>><<< 30564 1726882848.12296: stdout chunk (state=3): >>><<< 30564 1726882848.12309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882848.12316: handler run complete 30564 1726882848.12333: Evaluated conditional (False): False 30564 1726882848.12341: attempt loop complete, returning result 30564 1726882848.12344: _execute() done 30564 1726882848.12346: dumping result to json 30564 1726882848.12351: done dumping result, returning 30564 1726882848.12358: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-4216-acec-000000001069] 30564 1726882848.12370: sending task result for task 0e448fcc-3ce9-4216-acec-000000001069 30564 1726882848.12478: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001069 30564 1726882848.12481: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003245", "end": "2024-09-20 21:40:48.079955", "rc": 0, "start": "2024-09-20 21:40:48.076710" } STDOUT: bonding_masters eth0 lo rpltstbr 30564 1726882848.12547: no more pending results, returning what we have 30564 1726882848.12551: results queue empty 30564 1726882848.12553: checking for any_errors_fatal 30564 1726882848.12555: done checking for any_errors_fatal 30564 1726882848.12555: checking for max_fail_percentage 30564 1726882848.12557: done checking for max_fail_percentage 30564 1726882848.12558: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.12558: done checking to see if all hosts have failed 30564 1726882848.12559: getting the remaining hosts for this loop 30564 1726882848.12561: done getting the remaining hosts for this loop 30564 1726882848.12566: getting the next task for host managed_node2 30564 1726882848.12578: done getting next task for host 
managed_node2 30564 1726882848.12581: ^ task is: TASK: Set current_interfaces 30564 1726882848.12587: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882848.12591: getting variables 30564 1726882848.12593: in VariableManager get_vars() 30564 1726882848.12623: Calling all_inventory to load vars for managed_node2 30564 1726882848.12626: Calling groups_inventory to load vars for managed_node2 30564 1726882848.12629: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.12639: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.12642: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.12645: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.13544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.14505: done with get_vars() 30564 1726882848.14521: done getting variables 30564 1726882848.14569: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:40:48 -0400 (0:00:00.341) 0:00:46.727 ****** 30564 1726882848.14593: entering _queue_task() for managed_node2/set_fact 30564 1726882848.14805: worker is 1 (out of 1 available) 30564 1726882848.14817: exiting _queue_task() for managed_node2/set_fact 30564 1726882848.14830: done queuing things up, now waiting for results queue to drain 30564 1726882848.14831: waiting for pending results... 
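The "Gather current interface info" task above is just `ls -1` with `chdir: /sys/class/net`. A local Python equivalent (a sketch under the assumption of a Linux host: each entry under `/sys/class/net` is a network interface, plus the `bonding_masters` control file when the bonding module is loaded):

```python
import os

def current_interfaces(path="/sys/class/net"):
    """List network interfaces by reading sysfs, as the task's `ls -1` does."""
    try:
        return sorted(os.listdir(path))
    except (FileNotFoundError, NotADirectoryError):
        return []  # non-Linux host: sysfs is not available
```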
30564 1726882848.15018: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30564 1726882848.15085: in run() - task 0e448fcc-3ce9-4216-acec-00000000106a 30564 1726882848.15098: variable 'ansible_search_path' from source: unknown 30564 1726882848.15104: variable 'ansible_search_path' from source: unknown 30564 1726882848.15131: calling self._execute() 30564 1726882848.15210: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.15219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.15226: variable 'omit' from source: magic vars 30564 1726882848.15502: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.15513: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.15518: variable 'omit' from source: magic vars 30564 1726882848.15556: variable 'omit' from source: magic vars 30564 1726882848.15630: variable '_current_interfaces' from source: set_fact 30564 1726882848.15680: variable 'omit' from source: magic vars 30564 1726882848.15711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882848.15738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882848.15756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882848.15773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882848.15783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882848.15806: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882848.15811: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.15813: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.15883: Set connection var ansible_timeout to 10 30564 1726882848.15887: Set connection var ansible_pipelining to False 30564 1726882848.15889: Set connection var ansible_shell_type to sh 30564 1726882848.15895: Set connection var ansible_shell_executable to /bin/sh 30564 1726882848.15901: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882848.15903: Set connection var ansible_connection to ssh 30564 1726882848.15923: variable 'ansible_shell_executable' from source: unknown 30564 1726882848.15926: variable 'ansible_connection' from source: unknown 30564 1726882848.15929: variable 'ansible_module_compression' from source: unknown 30564 1726882848.15931: variable 'ansible_shell_type' from source: unknown 30564 1726882848.15933: variable 'ansible_shell_executable' from source: unknown 30564 1726882848.15935: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.15938: variable 'ansible_pipelining' from source: unknown 30564 1726882848.15940: variable 'ansible_timeout' from source: unknown 30564 1726882848.15944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.16045: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882848.16055: variable 'omit' from source: magic vars 30564 1726882848.16062: starting attempt loop 30564 1726882848.16066: running the handler 30564 1726882848.16076: handler run complete 30564 1726882848.16085: attempt loop complete, returning result 30564 1726882848.16088: _execute() done 30564 1726882848.16090: dumping result to json 30564 1726882848.16093: done dumping result, returning 30564 
1726882848.16099: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-4216-acec-00000000106a] 30564 1726882848.16104: sending task result for task 0e448fcc-3ce9-4216-acec-00000000106a 30564 1726882848.16195: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000106a 30564 1726882848.16197: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30564 1726882848.16252: no more pending results, returning what we have 30564 1726882848.16255: results queue empty 30564 1726882848.16256: checking for any_errors_fatal 30564 1726882848.16266: done checking for any_errors_fatal 30564 1726882848.16267: checking for max_fail_percentage 30564 1726882848.16271: done checking for max_fail_percentage 30564 1726882848.16272: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.16272: done checking to see if all hosts have failed 30564 1726882848.16273: getting the remaining hosts for this loop 30564 1726882848.16275: done getting the remaining hosts for this loop 30564 1726882848.16279: getting the next task for host managed_node2 30564 1726882848.16291: done getting next task for host managed_node2 30564 1726882848.16293: ^ task is: TASK: Show current_interfaces 30564 1726882848.16300: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882848.16303: getting variables 30564 1726882848.16304: in VariableManager get_vars() 30564 1726882848.16331: Calling all_inventory to load vars for managed_node2 30564 1726882848.16333: Calling groups_inventory to load vars for managed_node2 30564 1726882848.16336: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.16345: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.16348: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.16350: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.17251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.18206: done with get_vars() 30564 1726882848.18222: done getting variables 30564 1726882848.18266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:40:48 -0400 (0:00:00.036) 0:00:46.764 ****** 30564 1726882848.18290: entering _queue_task() for managed_node2/debug 30564 1726882848.18508: worker is 1 (out of 1 available) 30564 1726882848.18523: exiting _queue_task() for managed_node2/debug 30564 1726882848.18535: done queuing things up, now waiting for results queue to drain 30564 1726882848.18537: waiting for 
pending results... 30564 1726882848.18729: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30564 1726882848.18803: in run() - task 0e448fcc-3ce9-4216-acec-00000000102f 30564 1726882848.18814: variable 'ansible_search_path' from source: unknown 30564 1726882848.18819: variable 'ansible_search_path' from source: unknown 30564 1726882848.18847: calling self._execute() 30564 1726882848.18924: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.18927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.18936: variable 'omit' from source: magic vars 30564 1726882848.19234: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.19243: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.19249: variable 'omit' from source: magic vars 30564 1726882848.19282: variable 'omit' from source: magic vars 30564 1726882848.19348: variable 'current_interfaces' from source: set_fact 30564 1726882848.19371: variable 'omit' from source: magic vars 30564 1726882848.19406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882848.19435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882848.19450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882848.19465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882848.19478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882848.19510: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882848.19513: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.19517: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.19590: Set connection var ansible_timeout to 10 30564 1726882848.19594: Set connection var ansible_pipelining to False 30564 1726882848.19597: Set connection var ansible_shell_type to sh 30564 1726882848.19602: Set connection var ansible_shell_executable to /bin/sh 30564 1726882848.19608: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882848.19611: Set connection var ansible_connection to ssh 30564 1726882848.19629: variable 'ansible_shell_executable' from source: unknown 30564 1726882848.19632: variable 'ansible_connection' from source: unknown 30564 1726882848.19634: variable 'ansible_module_compression' from source: unknown 30564 1726882848.19637: variable 'ansible_shell_type' from source: unknown 30564 1726882848.19639: variable 'ansible_shell_executable' from source: unknown 30564 1726882848.19641: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.19643: variable 'ansible_pipelining' from source: unknown 30564 1726882848.19647: variable 'ansible_timeout' from source: unknown 30564 1726882848.19654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.19753: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882848.19762: variable 'omit' from source: magic vars 30564 1726882848.19771: starting attempt loop 30564 1726882848.19775: running the handler 30564 1726882848.19811: handler run complete 30564 1726882848.19821: attempt loop complete, returning result 30564 1726882848.19824: _execute() done 30564 1726882848.19827: dumping result to json 30564 1726882848.19829: done dumping result, returning 30564 
1726882848.19835: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-4216-acec-00000000102f] 30564 1726882848.19840: sending task result for task 0e448fcc-3ce9-4216-acec-00000000102f 30564 1726882848.19926: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000102f 30564 1726882848.19929: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30564 1726882848.19973: no more pending results, returning what we have 30564 1726882848.19977: results queue empty 30564 1726882848.19978: checking for any_errors_fatal 30564 1726882848.19984: done checking for any_errors_fatal 30564 1726882848.19985: checking for max_fail_percentage 30564 1726882848.19986: done checking for max_fail_percentage 30564 1726882848.19987: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.19988: done checking to see if all hosts have failed 30564 1726882848.19989: getting the remaining hosts for this loop 30564 1726882848.19990: done getting the remaining hosts for this loop 30564 1726882848.20000: getting the next task for host managed_node2 30564 1726882848.20009: done getting next task for host managed_node2 30564 1726882848.20012: ^ task is: TASK: Setup 30564 1726882848.20015: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882848.20019: getting variables 30564 1726882848.20021: in VariableManager get_vars() 30564 1726882848.20049: Calling all_inventory to load vars for managed_node2 30564 1726882848.20051: Calling groups_inventory to load vars for managed_node2 30564 1726882848.20054: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.20066: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.20069: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.20072: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.20882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.21831: done with get_vars() 30564 1726882848.21848: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:40:48 -0400 (0:00:00.036) 0:00:46.800 ****** 30564 1726882848.21912: entering _queue_task() for managed_node2/include_tasks 30564 1726882848.22114: worker is 1 (out of 1 available) 30564 1726882848.22128: exiting _queue_task() for managed_node2/include_tasks 30564 1726882848.22141: done queuing things up, now waiting for results queue to drain 30564 1726882848.22142: waiting for pending results... 
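
The `ok: [managed_node2]` result above ("current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr']") comes from a `debug` action, as the `Loading ActionModule 'debug'` line shows. A minimal sketch of what such a task might look like — the task name and variable name match the log, everything else is assumed:

```yaml
# Hypothetical reconstruction; only "Show current_interfaces" and the
# current_interfaces variable (set earlier via set_fact) appear in the log.
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```
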
30564 1726882848.22321: running TaskExecutor() for managed_node2/TASK: Setup 30564 1726882848.22378: in run() - task 0e448fcc-3ce9-4216-acec-000000001008 30564 1726882848.22392: variable 'ansible_search_path' from source: unknown 30564 1726882848.22395: variable 'ansible_search_path' from source: unknown 30564 1726882848.22428: variable 'lsr_setup' from source: include params 30564 1726882848.22586: variable 'lsr_setup' from source: include params 30564 1726882848.22640: variable 'omit' from source: magic vars 30564 1726882848.22739: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.22746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.22754: variable 'omit' from source: magic vars 30564 1726882848.22920: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.22933: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.22937: variable 'item' from source: unknown 30564 1726882848.22981: variable 'item' from source: unknown 30564 1726882848.23002: variable 'item' from source: unknown 30564 1726882848.23047: variable 'item' from source: unknown 30564 1726882848.23175: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.23178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.23181: variable 'omit' from source: magic vars 30564 1726882848.23256: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.23259: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.23266: variable 'item' from source: unknown 30564 1726882848.23311: variable 'item' from source: unknown 30564 1726882848.23330: variable 'item' from source: unknown 30564 1726882848.23374: variable 'item' from source: unknown 30564 1726882848.23439: dumping result to json 30564 1726882848.23442: done dumping result, returning 30564 
1726882848.23444: done running TaskExecutor() for managed_node2/TASK: Setup [0e448fcc-3ce9-4216-acec-000000001008] 30564 1726882848.23446: sending task result for task 0e448fcc-3ce9-4216-acec-000000001008 30564 1726882848.23489: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001008 30564 1726882848.23492: WORKER PROCESS EXITING 30564 1726882848.23520: no more pending results, returning what we have 30564 1726882848.23525: in VariableManager get_vars() 30564 1726882848.23559: Calling all_inventory to load vars for managed_node2 30564 1726882848.23561: Calling groups_inventory to load vars for managed_node2 30564 1726882848.23566: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.23579: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.23582: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.23585: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.24503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.25430: done with get_vars() 30564 1726882848.25443: variable 'ansible_search_path' from source: unknown 30564 1726882848.25444: variable 'ansible_search_path' from source: unknown 30564 1726882848.25476: variable 'ansible_search_path' from source: unknown 30564 1726882848.25478: variable 'ansible_search_path' from source: unknown 30564 1726882848.25495: we have included files to process 30564 1726882848.25496: generating all_blocks data 30564 1726882848.25497: done generating all_blocks data 30564 1726882848.25501: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882848.25501: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882848.25503: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882848.25653: done processing included file 30564 1726882848.25655: iterating over new_blocks loaded from include file 30564 1726882848.25656: in VariableManager get_vars() 30564 1726882848.25669: done with get_vars() 30564 1726882848.25671: filtering new block on tags 30564 1726882848.25696: done filtering new block on tags 30564 1726882848.25698: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30564 1726882848.25702: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882848.25702: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882848.25704: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882848.25759: done processing included file 30564 1726882848.25760: iterating over new_blocks loaded from include file 30564 1726882848.25761: in VariableManager get_vars() 30564 1726882848.25774: done with get_vars() 30564 1726882848.25775: filtering new block on tags 30564 1726882848.25789: done filtering new block on tags 30564 1726882848.25790: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30564 1726882848.25793: extending task lists for all hosts with included blocks 30564 1726882848.26148: done extending task lists 30564 1726882848.26149: done processing 
included files 30564 1726882848.26150: results queue empty 30564 1726882848.26150: checking for any_errors_fatal 30564 1726882848.26153: done checking for any_errors_fatal 30564 1726882848.26153: checking for max_fail_percentage 30564 1726882848.26154: done checking for max_fail_percentage 30564 1726882848.26154: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.26155: done checking to see if all hosts have failed 30564 1726882848.26155: getting the remaining hosts for this loop 30564 1726882848.26156: done getting the remaining hosts for this loop 30564 1726882848.26158: getting the next task for host managed_node2 30564 1726882848.26160: done getting next task for host managed_node2 30564 1726882848.26161: ^ task is: TASK: Include network role 30564 1726882848.26165: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882848.26167: getting variables 30564 1726882848.26169: in VariableManager get_vars() 30564 1726882848.26175: Calling all_inventory to load vars for managed_node2 30564 1726882848.26181: Calling groups_inventory to load vars for managed_node2 30564 1726882848.26182: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.26186: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.26187: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.26189: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.26896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.27867: done with get_vars() 30564 1726882848.27884: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:40:48 -0400 (0:00:00.060) 0:00:46.860 ****** 30564 1726882848.27932: entering _queue_task() for managed_node2/include_role 30564 1726882848.28157: worker is 1 (out of 1 available) 30564 1726882848.28172: exiting _queue_task() for managed_node2/include_role 30564 1726882848.28183: done queuing things up, now waiting for results queue to drain 30564 1726882848.28184: waiting for pending results... 
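
The next task, at `tasks/create_bridge_profile.yml:3`, pulls in `fedora.linux_system_roles.network` via `include_role` (the log enters `_queue_task() for managed_node2/include_role`). A sketch of the shape such an include typically takes — the role name and task name are from the log; the `network_connections` payload is purely illustrative, since the log does not show the role variables:

```yaml
# Hypothetical sketch of tasks/create_bridge_profile.yml; the profile
# definition below is an assumption, not recovered from this log.
- name: Include network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: statebr      # assumed bridge profile name
        type: bridge
        state: up
```
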
30564 1726882848.28366: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882848.28450: in run() - task 0e448fcc-3ce9-4216-acec-00000000108f 30564 1726882848.28459: variable 'ansible_search_path' from source: unknown 30564 1726882848.28463: variable 'ansible_search_path' from source: unknown 30564 1726882848.28493: calling self._execute() 30564 1726882848.28566: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.28572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.28580: variable 'omit' from source: magic vars 30564 1726882848.28854: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.28866: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.28872: _execute() done 30564 1726882848.28875: dumping result to json 30564 1726882848.28878: done dumping result, returning 30564 1726882848.28884: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-00000000108f] 30564 1726882848.28890: sending task result for task 0e448fcc-3ce9-4216-acec-00000000108f 30564 1726882848.29001: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000108f 30564 1726882848.29005: WORKER PROCESS EXITING 30564 1726882848.29031: no more pending results, returning what we have 30564 1726882848.29037: in VariableManager get_vars() 30564 1726882848.29078: Calling all_inventory to load vars for managed_node2 30564 1726882848.29081: Calling groups_inventory to load vars for managed_node2 30564 1726882848.29084: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.29095: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.29097: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.29100: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.29927: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.30877: done with get_vars() 30564 1726882848.30891: variable 'ansible_search_path' from source: unknown 30564 1726882848.30892: variable 'ansible_search_path' from source: unknown 30564 1726882848.31010: variable 'omit' from source: magic vars 30564 1726882848.31034: variable 'omit' from source: magic vars 30564 1726882848.31043: variable 'omit' from source: magic vars 30564 1726882848.31046: we have included files to process 30564 1726882848.31046: generating all_blocks data 30564 1726882848.31047: done generating all_blocks data 30564 1726882848.31048: processing included file: fedora.linux_system_roles.network 30564 1726882848.31061: in VariableManager get_vars() 30564 1726882848.31073: done with get_vars() 30564 1726882848.31093: in VariableManager get_vars() 30564 1726882848.31104: done with get_vars() 30564 1726882848.31129: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882848.31206: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882848.31251: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882848.31520: in VariableManager get_vars() 30564 1726882848.31533: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882848.32785: iterating over new_blocks loaded from include file 30564 1726882848.32787: in VariableManager get_vars() 30564 1726882848.32798: done with get_vars() 30564 1726882848.32799: filtering new block on tags 30564 1726882848.32995: done filtering new block on tags 30564 1726882848.32998: in VariableManager get_vars() 30564 1726882848.33007: done with get_vars() 30564 1726882848.33008: filtering new block on tags 30564 1726882848.33018: done 
filtering new block on tags 30564 1726882848.33019: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882848.33023: extending task lists for all hosts with included blocks 30564 1726882848.33124: done extending task lists 30564 1726882848.33125: done processing included files 30564 1726882848.33125: results queue empty 30564 1726882848.33126: checking for any_errors_fatal 30564 1726882848.33128: done checking for any_errors_fatal 30564 1726882848.33129: checking for max_fail_percentage 30564 1726882848.33129: done checking for max_fail_percentage 30564 1726882848.33130: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.33130: done checking to see if all hosts have failed 30564 1726882848.33131: getting the remaining hosts for this loop 30564 1726882848.33132: done getting the remaining hosts for this loop 30564 1726882848.33133: getting the next task for host managed_node2 30564 1726882848.33137: done getting next task for host managed_node2 30564 1726882848.33140: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882848.33143: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882848.33150: getting variables 30564 1726882848.33151: in VariableManager get_vars() 30564 1726882848.33159: Calling all_inventory to load vars for managed_node2 30564 1726882848.33160: Calling groups_inventory to load vars for managed_node2 30564 1726882848.33162: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.33167: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.33170: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.33172: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.33865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.34807: done with get_vars() 30564 1726882848.34822: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:40:48 -0400 (0:00:00.069) 0:00:46.930 ****** 30564 1726882848.34876: entering _queue_task() for managed_node2/include_tasks 30564 1726882848.35124: worker is 1 (out of 1 available) 30564 1726882848.35140: exiting _queue_task() for managed_node2/include_tasks 30564 1726882848.35153: done queuing things up, now waiting for results queue to drain 30564 1726882848.35154: waiting for pending results... 
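
The task at `roles/network/tasks/main.yml:4` is an `include_tasks` (the log enters `_queue_task() for managed_node2/include_tasks` and then loads `roles/network/tasks/set_facts.yml`). A minimal sketch consistent with those two facts — the exact YAML in the role may differ:

```yaml
# Hypothetical sketch of roles/network/tasks/main.yml:4, inferred from the
# task name and the set_facts.yml include that the log processes next.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
```
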
30564 1726882848.35340: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882848.35430: in run() - task 0e448fcc-3ce9-4216-acec-0000000010f5 30564 1726882848.35441: variable 'ansible_search_path' from source: unknown 30564 1726882848.35445: variable 'ansible_search_path' from source: unknown 30564 1726882848.35475: calling self._execute() 30564 1726882848.35553: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.35557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.35567: variable 'omit' from source: magic vars 30564 1726882848.35844: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.35856: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.35859: _execute() done 30564 1726882848.35862: dumping result to json 30564 1726882848.35866: done dumping result, returning 30564 1726882848.35875: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-0000000010f5] 30564 1726882848.35880: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f5 30564 1726882848.35963: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f5 30564 1726882848.35967: WORKER PROCESS EXITING 30564 1726882848.36016: no more pending results, returning what we have 30564 1726882848.36021: in VariableManager get_vars() 30564 1726882848.36061: Calling all_inventory to load vars for managed_node2 30564 1726882848.36065: Calling groups_inventory to load vars for managed_node2 30564 1726882848.36070: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.36085: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.36091: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.36094: Calling 
groups_plugins_play to load vars for managed_node2 30564 1726882848.36999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.37945: done with get_vars() 30564 1726882848.37959: variable 'ansible_search_path' from source: unknown 30564 1726882848.37960: variable 'ansible_search_path' from source: unknown 30564 1726882848.37988: we have included files to process 30564 1726882848.37989: generating all_blocks data 30564 1726882848.37990: done generating all_blocks data 30564 1726882848.37993: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882848.37993: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882848.37995: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882848.38370: done processing included file 30564 1726882848.38371: iterating over new_blocks loaded from include file 30564 1726882848.38373: in VariableManager get_vars() 30564 1726882848.38388: done with get_vars() 30564 1726882848.38390: filtering new block on tags 30564 1726882848.38408: done filtering new block on tags 30564 1726882848.38410: in VariableManager get_vars() 30564 1726882848.38422: done with get_vars() 30564 1726882848.38423: filtering new block on tags 30564 1726882848.38447: done filtering new block on tags 30564 1726882848.38449: in VariableManager get_vars() 30564 1726882848.38461: done with get_vars() 30564 1726882848.38462: filtering new block on tags 30564 1726882848.38492: done filtering new block on tags 30564 1726882848.38493: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882848.38497: extending task lists for 
all hosts with included blocks 30564 1726882848.39496: done extending task lists 30564 1726882848.39497: done processing included files 30564 1726882848.39498: results queue empty 30564 1726882848.39498: checking for any_errors_fatal 30564 1726882848.39500: done checking for any_errors_fatal 30564 1726882848.39500: checking for max_fail_percentage 30564 1726882848.39501: done checking for max_fail_percentage 30564 1726882848.39502: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.39502: done checking to see if all hosts have failed 30564 1726882848.39503: getting the remaining hosts for this loop 30564 1726882848.39504: done getting the remaining hosts for this loop 30564 1726882848.39505: getting the next task for host managed_node2 30564 1726882848.39509: done getting next task for host managed_node2 30564 1726882848.39510: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882848.39513: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882848.39520: getting variables 30564 1726882848.39521: in VariableManager get_vars() 30564 1726882848.39529: Calling all_inventory to load vars for managed_node2 30564 1726882848.39530: Calling groups_inventory to load vars for managed_node2 30564 1726882848.39531: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.39534: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.39536: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.39537: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.40195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.41197: done with get_vars() 30564 1726882848.41211: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:40:48 -0400 (0:00:00.063) 0:00:46.993 ****** 30564 1726882848.41259: entering _queue_task() for managed_node2/setup 30564 1726882848.41499: worker is 1 (out of 1 available) 30564 1726882848.41513: exiting _queue_task() for managed_node2/setup 30564 1726882848.41527: done queuing things up, now waiting for results queue to drain 30564 1726882848.41528: waiting for pending results... 
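
The task at `set_facts.yml:3` is a conditional fact-gathering step: the log enters `_queue_task() for managed_node2/setup`, evaluates `(__network_required_facts | difference(ansible_facts.keys() | list) | length > 0)` to False, and skips with a `no_log`-censored result. A sketch under those observations — the `when` expression and `no_log` behavior are taken from the log, the `gather_subset` value is an assumption:

```yaml
# Hypothetical sketch of set_facts.yml:3. The when expression matches the
# "Evaluated conditional" line in the log; gather_subset is assumed.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
```

Because every fact named in `__network_required_facts` was already present in `ansible_facts`, the difference is empty and the gather is skipped rather than re-run.
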
30564 1726882848.41716: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882848.41812: in run() - task 0e448fcc-3ce9-4216-acec-000000001152 30564 1726882848.41823: variable 'ansible_search_path' from source: unknown 30564 1726882848.41826: variable 'ansible_search_path' from source: unknown 30564 1726882848.41856: calling self._execute() 30564 1726882848.41931: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.41935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.41945: variable 'omit' from source: magic vars 30564 1726882848.42220: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.42230: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.42378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882848.43918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882848.43961: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882848.43990: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882848.44016: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882848.44037: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882848.44095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882848.44116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882848.44136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882848.44164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882848.44176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882848.44211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882848.44229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882848.44248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882848.44278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882848.44289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882848.44393: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882848.44400: variable 'ansible_facts' from source: unknown 30564 1726882848.44881: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882848.44887: when evaluation is False, skipping this task 30564 1726882848.44890: _execute() done 30564 1726882848.44894: dumping result to json 30564 1726882848.44897: done dumping result, returning 30564 1726882848.44899: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000001152] 30564 1726882848.44907: sending task result for task 0e448fcc-3ce9-4216-acec-000000001152 30564 1726882848.44989: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001152 30564 1726882848.44993: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882848.45061: no more pending results, returning what we have 30564 1726882848.45067: results queue empty 30564 1726882848.45070: checking for any_errors_fatal 30564 1726882848.45072: done checking for any_errors_fatal 30564 1726882848.45073: checking for max_fail_percentage 30564 1726882848.45074: done checking for max_fail_percentage 30564 1726882848.45075: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.45076: done checking to see if all hosts have failed 30564 1726882848.45077: getting the remaining hosts for this loop 30564 1726882848.45079: done getting the remaining hosts for this loop 30564 1726882848.45082: getting the next task for host managed_node2 30564 1726882848.45093: done getting next task for host managed_node2 30564 1726882848.45096: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882848.45102: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882848.45126: getting variables 30564 1726882848.45128: in VariableManager get_vars() 30564 1726882848.45158: Calling all_inventory to load vars for managed_node2 30564 1726882848.45160: Calling groups_inventory to load vars for managed_node2 30564 1726882848.45163: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.45176: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.45178: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.45186: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.45999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.46949: done with get_vars() 30564 1726882848.46967: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:40:48 -0400 (0:00:00.057) 0:00:47.051 ****** 30564 1726882848.47034: entering _queue_task() for managed_node2/stat 30564 1726882848.47243: worker is 1 (out of 1 available) 30564 1726882848.47258: exiting _queue_task() for managed_node2/stat 30564 1726882848.47273: done queuing things up, now waiting for results queue to drain 30564 1726882848.47275: waiting for pending results... 
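
The task skipped just above evaluated the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` to False, and its result was censored because `no_log: true` was set. A hedged sketch of what such a gather-only-when-needed task typically looks like (the variable names come from the log; the exact task body inside the role's `set_facts.yml` may differ):

```yaml
# Illustrative only: re-gather facts solely when a required fact is missing.
# The conditional expression is taken verbatim from the log above.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true  # matches the censored result shown in the log
```

Because every listed fact was already present in `ansible_facts`, the difference was empty, the length comparison was False, and the TaskExecutor skipped the task without contacting the host.
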
30564 1726882848.47451: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882848.47551: in run() - task 0e448fcc-3ce9-4216-acec-000000001154 30564 1726882848.47562: variable 'ansible_search_path' from source: unknown 30564 1726882848.47571: variable 'ansible_search_path' from source: unknown 30564 1726882848.47595: calling self._execute() 30564 1726882848.47670: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.47675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.47681: variable 'omit' from source: magic vars 30564 1726882848.47941: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.47953: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.48066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882848.48247: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882848.48282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882848.48308: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882848.48331: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882848.48398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882848.48415: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882848.48433: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882848.48450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882848.48514: variable '__network_is_ostree' from source: set_fact 30564 1726882848.48519: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882848.48522: when evaluation is False, skipping this task 30564 1726882848.48526: _execute() done 30564 1726882848.48529: dumping result to json 30564 1726882848.48532: done dumping result, returning 30564 1726882848.48538: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000001154] 30564 1726882848.48544: sending task result for task 0e448fcc-3ce9-4216-acec-000000001154 30564 1726882848.48630: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001154 30564 1726882848.48633: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882848.48688: no more pending results, returning what we have 30564 1726882848.48691: results queue empty 30564 1726882848.48693: checking for any_errors_fatal 30564 1726882848.48698: done checking for any_errors_fatal 30564 1726882848.48699: checking for max_fail_percentage 30564 1726882848.48701: done checking for max_fail_percentage 30564 1726882848.48701: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.48702: done checking to see if all hosts have failed 30564 1726882848.48703: getting the remaining hosts for this loop 30564 1726882848.48704: done getting the remaining hosts for this loop 30564 
1726882848.48707: getting the next task for host managed_node2 30564 1726882848.48714: done getting next task for host managed_node2 30564 1726882848.48717: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882848.48722: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882848.48738: getting variables 30564 1726882848.48739: in VariableManager get_vars() 30564 1726882848.48779: Calling all_inventory to load vars for managed_node2 30564 1726882848.48782: Calling groups_inventory to load vars for managed_node2 30564 1726882848.48783: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.48790: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.48791: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.48793: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.49694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.50631: done with get_vars() 30564 1726882848.50645: done getting variables 30564 1726882848.50687: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:40:48 -0400 (0:00:00.036) 0:00:47.088 ****** 30564 1726882848.50712: entering _queue_task() for managed_node2/set_fact 30564 1726882848.50906: worker is 1 (out of 1 available) 30564 1726882848.50917: exiting _queue_task() for managed_node2/set_fact 30564 1726882848.50929: done queuing things up, now waiting for results queue to drain 30564 1726882848.50930: waiting for pending results... 
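
Both ostree tasks here ("Check if system is ostree" via `stat`, then "Set flag to indicate system is ostree" via `set_fact`) skip because `__network_is_ostree` was already set by an earlier `set_fact`, so the guard `not __network_is_ostree is defined` is False. A hedged sketch of this detect-once pattern; the marker path `/run/ostree-booted` and register name are assumptions based on common system-roles code, not taken from this log:

```yaml
# Illustrative only: probe for ostree a single time, then cache the answer
# so reruns of the role skip both tasks (exactly what the log shows).
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted  # assumed marker file
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```
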
30564 1726882848.51108: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882848.51209: in run() - task 0e448fcc-3ce9-4216-acec-000000001155 30564 1726882848.51219: variable 'ansible_search_path' from source: unknown 30564 1726882848.51222: variable 'ansible_search_path' from source: unknown 30564 1726882848.51253: calling self._execute() 30564 1726882848.51324: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.51327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.51337: variable 'omit' from source: magic vars 30564 1726882848.51593: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.51604: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.51715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882848.51901: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882848.51931: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882848.51954: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882848.51980: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882848.52043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882848.52060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882848.52082: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882848.52099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882848.52160: variable '__network_is_ostree' from source: set_fact 30564 1726882848.52167: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882848.52172: when evaluation is False, skipping this task 30564 1726882848.52175: _execute() done 30564 1726882848.52177: dumping result to json 30564 1726882848.52180: done dumping result, returning 30564 1726882848.52185: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000001155] 30564 1726882848.52191: sending task result for task 0e448fcc-3ce9-4216-acec-000000001155 30564 1726882848.52278: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001155 30564 1726882848.52282: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882848.52323: no more pending results, returning what we have 30564 1726882848.52326: results queue empty 30564 1726882848.52328: checking for any_errors_fatal 30564 1726882848.52332: done checking for any_errors_fatal 30564 1726882848.52333: checking for max_fail_percentage 30564 1726882848.52334: done checking for max_fail_percentage 30564 1726882848.52335: checking to see if all hosts have failed and the running result is not ok 30564 1726882848.52336: done checking to see if all hosts have failed 30564 1726882848.52337: getting the remaining hosts for this loop 30564 1726882848.52338: done getting the remaining hosts for this loop 
30564 1726882848.52341: getting the next task for host managed_node2 30564 1726882848.52349: done getting next task for host managed_node2 30564 1726882848.52353: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882848.52359: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882848.52385: getting variables 30564 1726882848.52387: in VariableManager get_vars() 30564 1726882848.52415: Calling all_inventory to load vars for managed_node2 30564 1726882848.52417: Calling groups_inventory to load vars for managed_node2 30564 1726882848.52418: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882848.52425: Calling all_plugins_play to load vars for managed_node2 30564 1726882848.52426: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882848.52428: Calling groups_plugins_play to load vars for managed_node2 30564 1726882848.53212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882848.54259: done with get_vars() 30564 1726882848.54278: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:40:48 -0400 (0:00:00.036) 0:00:47.124 ****** 30564 1726882848.54344: entering _queue_task() for managed_node2/service_facts 30564 1726882848.54544: worker is 1 (out of 1 available) 30564 1726882848.54558: exiting _queue_task() for managed_node2/service_facts 30564 1726882848.54575: done queuing things up, now waiting for results queue to drain 30564 1726882848.54577: waiting for pending results... 
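
Unlike the two skipped tasks before it, the queued "Check which services are running" task actually executes: the log that follows shows the ssh connection setup, remote temp-directory creation, and transfer of `AnsiballZ_service_facts.py`. A hedged sketch of the task and one way its output is commonly consumed (the `debug` consumer is illustrative and not part of this role; the `ansible_facts.services` key is standard `service_facts` module output):

```yaml
# Illustrative only: gather the service state table on the managed node.
- name: Check which services are running
  service_facts:

# Hypothetical consumer showing the shape of the gathered data.
- name: Report whether NetworkManager is running
  debug:
    msg: "{{ ansible_facts.services['NetworkManager.service'].state | default('absent') }}"
```
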
30564 1726882848.54746: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882848.54846: in run() - task 0e448fcc-3ce9-4216-acec-000000001157 30564 1726882848.54857: variable 'ansible_search_path' from source: unknown 30564 1726882848.54860: variable 'ansible_search_path' from source: unknown 30564 1726882848.54892: calling self._execute() 30564 1726882848.54961: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.54971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.54982: variable 'omit' from source: magic vars 30564 1726882848.55240: variable 'ansible_distribution_major_version' from source: facts 30564 1726882848.55251: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882848.55256: variable 'omit' from source: magic vars 30564 1726882848.55311: variable 'omit' from source: magic vars 30564 1726882848.55333: variable 'omit' from source: magic vars 30564 1726882848.55366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882848.55393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882848.55407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882848.55424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882848.55434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882848.55456: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882848.55460: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.55462: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882848.55533: Set connection var ansible_timeout to 10 30564 1726882848.55537: Set connection var ansible_pipelining to False 30564 1726882848.55540: Set connection var ansible_shell_type to sh 30564 1726882848.55545: Set connection var ansible_shell_executable to /bin/sh 30564 1726882848.55553: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882848.55555: Set connection var ansible_connection to ssh 30564 1726882848.55575: variable 'ansible_shell_executable' from source: unknown 30564 1726882848.55578: variable 'ansible_connection' from source: unknown 30564 1726882848.55581: variable 'ansible_module_compression' from source: unknown 30564 1726882848.55583: variable 'ansible_shell_type' from source: unknown 30564 1726882848.55586: variable 'ansible_shell_executable' from source: unknown 30564 1726882848.55588: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882848.55590: variable 'ansible_pipelining' from source: unknown 30564 1726882848.55594: variable 'ansible_timeout' from source: unknown 30564 1726882848.55598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882848.55743: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882848.55747: variable 'omit' from source: magic vars 30564 1726882848.55752: starting attempt loop 30564 1726882848.55755: running the handler 30564 1726882848.55772: _low_level_execute_command(): starting 30564 1726882848.55775: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882848.56298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882848.56310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.56333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882848.56347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882848.56357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.56407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882848.56411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882848.56428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882848.56545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882848.58210: stdout chunk (state=3): >>>/root <<< 30564 1726882848.58309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882848.58358: stderr chunk (state=3): >>><<< 30564 1726882848.58367: stdout chunk (state=3): >>><<< 30564 1726882848.58387: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882848.58397: _low_level_execute_command(): starting 30564 1726882848.58402: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094 `" && echo ansible-tmp-1726882848.5838537-32687-42371680388094="` echo /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094 `" ) && sleep 0' 30564 1726882848.58829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882848.58832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.58873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.58884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.58887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.58926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882848.58930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882848.59042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882848.60915: stdout chunk (state=3): >>>ansible-tmp-1726882848.5838537-32687-42371680388094=/root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094 <<< 30564 1726882848.61027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882848.61071: stderr chunk (state=3): >>><<< 30564 1726882848.61077: stdout chunk (state=3): >>><<< 30564 1726882848.61090: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882848.5838537-32687-42371680388094=/root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882848.61127: variable 'ansible_module_compression' from source: unknown 30564 1726882848.61166: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882848.61198: variable 'ansible_facts' from source: unknown 30564 1726882848.61256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094/AnsiballZ_service_facts.py 30564 1726882848.61359: Sending initial data 30564 1726882848.61362: Sent initial data (161 bytes) 30564 1726882848.62011: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.62015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.62046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.62052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882848.62058: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 30564 1726882848.62076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.62082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.62138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882848.62141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882848.62152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882848.62259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882848.64069: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 30564 1726882848.64073: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882848.64166: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882848.64261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmplm24spmf /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094/AnsiballZ_service_facts.py <<< 30564 1726882848.64359: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882848.65409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882848.65501: stderr chunk (state=3): >>><<< 30564 1726882848.65504: stdout chunk (state=3): >>><<< 30564 1726882848.65517: done transferring module to remote 30564 1726882848.65525: _low_level_execute_command(): starting 30564 1726882848.65530: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094/ /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094/AnsiballZ_service_facts.py && sleep 0' 30564 1726882848.65959: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.65962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.66003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.66008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.66010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.66053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 
1726882848.66062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882848.66171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882848.67975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882848.68014: stderr chunk (state=3): >>><<< 30564 1726882848.68018: stdout chunk (state=3): >>><<< 30564 1726882848.68031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882848.68035: _low_level_execute_command(): starting 30564 1726882848.68037: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094/AnsiballZ_service_facts.py && sleep 0' 30564 1726882848.68447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.68452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.68493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.68499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882848.68509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882848.68515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882848.68577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882848.68583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882848.68697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.03790: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": 
"modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-f<<< 30564 1726882850.03800: stdout chunk (state=3): >>>ound", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 30564 1726882850.03803: stdout chunk (state=3): >>>tatic", 
"source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alia<<< 30564 1726882850.03807: stdout chunk (state=3): >>>s", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper<<< 30564 1726882850.03809: stdout chunk (state=3): >>>-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": 
"systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "<<< 30564 1726882850.03813: stdout chunk (state=3): >>>source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882850.05083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882850.05087: stdout chunk (state=3): >>><<< 30564 1726882850.05090: stderr chunk (state=3): >>><<< 30564 1726882850.05239: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882850.05816: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882850.05831: _low_level_execute_command(): starting 30564 1726882850.05841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882848.5838537-32687-42371680388094/ > /dev/null 2>&1 && sleep 0' 30564 1726882850.06504: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882850.06519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882850.06534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.06554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.06605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.06618: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882850.06646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.06666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882850.06683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882850.06694: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882850.06706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882850.06718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.06733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.06744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.06754: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882850.06772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.06848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882850.06873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882850.06889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882850.07126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.08986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882850.09048: stderr chunk (state=3): >>><<< 30564 1726882850.09051: stdout chunk (state=3): >>><<< 30564 1726882850.09414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882850.09417: handler run complete 30564 1726882850.09420: variable 'ansible_facts' from source: unknown 30564 1726882850.09471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882850.09988: variable 'ansible_facts' from source: unknown 30564 1726882850.10145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882850.10372: attempt loop complete, returning result 30564 1726882850.10385: _execute() done 30564 1726882850.10398: dumping result to json 30564 1726882850.10471: done dumping result, returning 30564 1726882850.10485: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000001157] 30564 1726882850.10502: sending task result for task 0e448fcc-3ce9-4216-acec-000000001157 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882850.11442: no more pending results, returning what we have 30564 1726882850.11445: results queue empty 30564 1726882850.11446: checking for any_errors_fatal 30564 1726882850.11453: done checking for any_errors_fatal 30564 1726882850.11454: checking for max_fail_percentage 30564 
1726882850.11456: done checking for max_fail_percentage 30564 1726882850.11457: checking to see if all hosts have failed and the running result is not ok 30564 1726882850.11457: done checking to see if all hosts have failed 30564 1726882850.11458: getting the remaining hosts for this loop 30564 1726882850.11460: done getting the remaining hosts for this loop 30564 1726882850.11473: getting the next task for host managed_node2 30564 1726882850.11484: done getting next task for host managed_node2 30564 1726882850.11488: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882850.11495: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882850.11511: getting variables 30564 1726882850.11512: in VariableManager get_vars() 30564 1726882850.11552: Calling all_inventory to load vars for managed_node2 30564 1726882850.11555: Calling groups_inventory to load vars for managed_node2 30564 1726882850.11558: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882850.11574: Calling all_plugins_play to load vars for managed_node2 30564 1726882850.11583: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882850.11587: Calling groups_plugins_play to load vars for managed_node2 30564 1726882850.12583: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001157 30564 1726882850.12586: WORKER PROCESS EXITING 30564 1726882850.13699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882850.21908: done with get_vars() 30564 1726882850.21924: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:40:50 -0400 (0:00:01.676) 0:00:48.801 ****** 30564 1726882850.21983: entering _queue_task() for managed_node2/package_facts 30564 1726882850.22216: worker is 1 (out of 1 available) 30564 1726882850.22230: exiting _queue_task() for managed_node2/package_facts 30564 1726882850.22244: done queuing things up, now waiting for results queue to drain 30564 1726882850.22246: waiting for pending results... 
30564 1726882850.22435: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882850.22544: in run() - task 0e448fcc-3ce9-4216-acec-000000001158 30564 1726882850.22557: variable 'ansible_search_path' from source: unknown 30564 1726882850.22562: variable 'ansible_search_path' from source: unknown 30564 1726882850.22593: calling self._execute() 30564 1726882850.22671: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882850.22675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882850.22686: variable 'omit' from source: magic vars 30564 1726882850.23056: variable 'ansible_distribution_major_version' from source: facts 30564 1726882850.23093: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882850.23109: variable 'omit' from source: magic vars 30564 1726882850.23199: variable 'omit' from source: magic vars 30564 1726882850.23241: variable 'omit' from source: magic vars 30564 1726882850.23299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882850.23348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882850.23375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882850.23406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882850.23424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882850.23460: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882850.23470: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882850.23479: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882850.23588: Set connection var ansible_timeout to 10 30564 1726882850.23601: Set connection var ansible_pipelining to False 30564 1726882850.23611: Set connection var ansible_shell_type to sh 30564 1726882850.23624: Set connection var ansible_shell_executable to /bin/sh 30564 1726882850.23636: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882850.23643: Set connection var ansible_connection to ssh 30564 1726882850.23677: variable 'ansible_shell_executable' from source: unknown 30564 1726882850.23687: variable 'ansible_connection' from source: unknown 30564 1726882850.23695: variable 'ansible_module_compression' from source: unknown 30564 1726882850.23727: variable 'ansible_shell_type' from source: unknown 30564 1726882850.23742: variable 'ansible_shell_executable' from source: unknown 30564 1726882850.23750: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882850.23772: variable 'ansible_pipelining' from source: unknown 30564 1726882850.23793: variable 'ansible_timeout' from source: unknown 30564 1726882850.23813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882850.24074: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882850.24104: variable 'omit' from source: magic vars 30564 1726882850.24108: starting attempt loop 30564 1726882850.24111: running the handler 30564 1726882850.24121: _low_level_execute_command(): starting 30564 1726882850.24128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882850.24631: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882850.24646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.24667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.24682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882850.24697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.24740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882850.24753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882850.24873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.26533: stdout chunk (state=3): >>>/root <<< 30564 1726882850.26637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882850.26712: stderr chunk (state=3): >>><<< 30564 1726882850.26716: stdout chunk (state=3): >>><<< 30564 1726882850.26737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882850.26748: _low_level_execute_command(): starting 30564 1726882850.26754: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279 `" && echo ansible-tmp-1726882850.2673457-32724-53180859854279="` echo /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279 `" ) && sleep 0' 30564 1726882850.27339: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882850.27348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882850.27360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.27380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.27428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.27439: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882850.27448: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.27469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882850.27483: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882850.27491: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882850.27499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882850.27508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.27520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.27529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.27536: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882850.27553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.27685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882850.27699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882850.27709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882850.27827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.29757: stdout chunk (state=3): >>>ansible-tmp-1726882850.2673457-32724-53180859854279=/root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279 <<< 30564 1726882850.29886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882850.29936: stderr chunk (state=3): >>><<< 30564 1726882850.29939: stdout chunk (state=3): >>><<< 30564 1726882850.29951: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882850.2673457-32724-53180859854279=/root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882850.30249: variable 'ansible_module_compression' from source: unknown 30564 1726882850.30253: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882850.30256: variable 'ansible_facts' from source: unknown 30564 1726882850.30554: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279/AnsiballZ_package_facts.py 30564 1726882850.30558: Sending initial data 30564 1726882850.30561: Sent initial data (161 bytes) 30564 1726882850.31354: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882850.31357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.31378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.31384: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.31415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.31418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.31472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882850.31487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882850.31593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.33346: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882850.33443: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882850.33541: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmppe9g1s5a /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279/AnsiballZ_package_facts.py <<< 30564 1726882850.33633: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882850.35914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882850.36004: stderr chunk (state=3): >>><<< 30564 1726882850.36008: stdout chunk (state=3): >>><<< 30564 1726882850.36023: done transferring module to remote 30564 1726882850.36033: _low_level_execute_command(): starting 30564 1726882850.36039: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279/ /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279/AnsiballZ_package_facts.py && sleep 0' 30564 1726882850.36436: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.36441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.36477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882850.36499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882850.36502: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.36508: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.36554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882850.36558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882850.36667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.38442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882850.38487: stderr chunk (state=3): >>><<< 30564 1726882850.38490: stdout chunk (state=3): >>><<< 30564 1726882850.38506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882850.38509: _low_level_execute_command(): starting 30564 1726882850.38512: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279/AnsiballZ_package_facts.py && sleep 0' 30564 1726882850.38924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.38934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.38960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.38978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.39027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882850.39042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882850.39146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.85324: stdout 
chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", 
"release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": 
"11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", 
"release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [<<< 30564 1726882850.85376: stdout chunk (state=3): >>>{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": 
"python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": <<< 30564 1726882850.85419: stdout chunk (state=3): >>>"7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", 
"release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1<<< 30564 1726882850.85440: stdout chunk (state=3): >>>.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": 
"1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", <<< 30564 1726882850.85453: stdout chunk (state=3): >>>"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": 
"grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-bas<<< 30564 1726882850.85459: stdout chunk (state=3): >>>e-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch<<< 30564 1726882850.85462: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source"<<< 30564 1726882850.85470: stdout chunk (state=3): >>>: "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": 
[{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "ar<<< 30564 1726882850.85493: stdout chunk (state=3): >>>ch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", <<< 30564 1726882850.85499: stdout chunk (state=3): >>>"release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": 
"perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": 
"4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": 
"perl-MIME-Base64", "<<< 30564 1726882850.85502: stdout chunk (state=3): >>>version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": 
"481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64",<<< 30564 1726882850.85506: stdout chunk (state=3): >>> "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", 
"release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "re<<< 30564 1726882850.85509: stdout chunk (state=3): >>>lease": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": 
"restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": 
"1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", <<< 30564 1726882850.85513: stdout chunk (state=3): >>>"source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882850.86982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882850.87084: stderr chunk (state=3): >>><<< 30564 1726882850.87087: stdout chunk (state=3): >>><<< 30564 1726882850.87283: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": 
"3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": 
"popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", 
"release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", 
"version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": 
"libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": 
[{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": 
[{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": 
"grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": 
"libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", 
"release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": 
"0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": 
"python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": 
"policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": 
"yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", 
"epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": 
"3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": 
"41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882850.89712: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882850.89734: _low_level_execute_command(): starting 30564 1726882850.89742: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882850.2673457-32724-53180859854279/ > /dev/null 2>&1 && sleep 0' 30564 1726882850.90407: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882850.90427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882850.90442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.90462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.90508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.90521: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882850.90541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.90560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882850.90578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882850.90591: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882850.90604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882850.90618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882850.90635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882850.90653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882850.90671: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882850.90687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882850.90767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882850.90792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882850.90809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882850.90938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882850.92849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882850.92852: stdout chunk (state=3): >>><<< 30564 1726882850.92855: stderr chunk (state=3): >>><<< 30564 1726882850.93071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882850.93075: handler run complete 30564 1726882850.93839: variable 'ansible_facts' from source: unknown 30564 1726882850.94419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882850.96571: variable 'ansible_facts' from source: unknown 30564 1726882850.97065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882850.97895: attempt loop complete, returning result 30564 1726882850.97913: _execute() done 30564 1726882850.97921: dumping result to json 30564 1726882850.98197: done dumping result, returning 30564 1726882850.98211: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000001158] 30564 1726882850.98221: sending task result for task 0e448fcc-3ce9-4216-acec-000000001158 30564 1726882851.00845: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001158 30564 1726882851.00848: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882851.00994: no more pending results, returning what we have 30564 1726882851.00996: results queue empty 30564 1726882851.00997: checking for 
any_errors_fatal 30564 1726882851.01002: done checking for any_errors_fatal 30564 1726882851.01002: checking for max_fail_percentage 30564 1726882851.01004: done checking for max_fail_percentage 30564 1726882851.01004: checking to see if all hosts have failed and the running result is not ok 30564 1726882851.01005: done checking to see if all hosts have failed 30564 1726882851.01006: getting the remaining hosts for this loop 30564 1726882851.01007: done getting the remaining hosts for this loop 30564 1726882851.01010: getting the next task for host managed_node2 30564 1726882851.01018: done getting next task for host managed_node2 30564 1726882851.01021: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882851.01026: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882851.01036: getting variables 30564 1726882851.01038: in VariableManager get_vars() 30564 1726882851.01066: Calling all_inventory to load vars for managed_node2 30564 1726882851.01069: Calling groups_inventory to load vars for managed_node2 30564 1726882851.01076: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882851.01085: Calling all_plugins_play to load vars for managed_node2 30564 1726882851.01088: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882851.01090: Calling groups_plugins_play to load vars for managed_node2 30564 1726882851.02513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882851.04319: done with get_vars() 30564 1726882851.04342: done getting variables 30564 1726882851.04398: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:40:51 -0400 (0:00:00.824) 0:00:49.625 ****** 30564 1726882851.04434: entering _queue_task() for managed_node2/debug 30564 1726882851.04755: worker is 1 (out of 1 available) 30564 1726882851.04775: exiting _queue_task() for managed_node2/debug 30564 1726882851.04792: done queuing things up, now waiting for results queue to drain 30564 1726882851.04793: waiting for pending results... 
30564 1726882851.05119: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882851.05281: in run() - task 0e448fcc-3ce9-4216-acec-0000000010f6 30564 1726882851.05304: variable 'ansible_search_path' from source: unknown 30564 1726882851.05312: variable 'ansible_search_path' from source: unknown 30564 1726882851.05373: calling self._execute() 30564 1726882851.05495: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.05508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.05524: variable 'omit' from source: magic vars 30564 1726882851.05945: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.05965: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882851.05981: variable 'omit' from source: magic vars 30564 1726882851.06059: variable 'omit' from source: magic vars 30564 1726882851.06171: variable 'network_provider' from source: set_fact 30564 1726882851.06194: variable 'omit' from source: magic vars 30564 1726882851.06242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882851.06293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882851.06319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882851.06344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882851.06370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882851.06425: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882851.06439: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882851.06447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.06561: Set connection var ansible_timeout to 10 30564 1726882851.06583: Set connection var ansible_pipelining to False 30564 1726882851.06591: Set connection var ansible_shell_type to sh 30564 1726882851.06603: Set connection var ansible_shell_executable to /bin/sh 30564 1726882851.06615: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882851.06622: Set connection var ansible_connection to ssh 30564 1726882851.06658: variable 'ansible_shell_executable' from source: unknown 30564 1726882851.06673: variable 'ansible_connection' from source: unknown 30564 1726882851.06682: variable 'ansible_module_compression' from source: unknown 30564 1726882851.06694: variable 'ansible_shell_type' from source: unknown 30564 1726882851.06702: variable 'ansible_shell_executable' from source: unknown 30564 1726882851.06708: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.06717: variable 'ansible_pipelining' from source: unknown 30564 1726882851.06723: variable 'ansible_timeout' from source: unknown 30564 1726882851.06731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.06889: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882851.06911: variable 'omit' from source: magic vars 30564 1726882851.06921: starting attempt loop 30564 1726882851.06928: running the handler 30564 1726882851.06984: handler run complete 30564 1726882851.07001: attempt loop complete, returning result 30564 1726882851.07007: _execute() done 30564 1726882851.07017: dumping result to json 30564 1726882851.07024: done dumping result, returning 
30564 1726882851.07033: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-0000000010f6] 30564 1726882851.07041: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f6 ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882851.07198: no more pending results, returning what we have 30564 1726882851.07201: results queue empty 30564 1726882851.07202: checking for any_errors_fatal 30564 1726882851.07211: done checking for any_errors_fatal 30564 1726882851.07212: checking for max_fail_percentage 30564 1726882851.07213: done checking for max_fail_percentage 30564 1726882851.07214: checking to see if all hosts have failed and the running result is not ok 30564 1726882851.07215: done checking to see if all hosts have failed 30564 1726882851.07216: getting the remaining hosts for this loop 30564 1726882851.07218: done getting the remaining hosts for this loop 30564 1726882851.07221: getting the next task for host managed_node2 30564 1726882851.07230: done getting next task for host managed_node2 30564 1726882851.07234: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882851.07240: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882851.07252: getting variables 30564 1726882851.07254: in VariableManager get_vars() 30564 1726882851.07296: Calling all_inventory to load vars for managed_node2 30564 1726882851.07299: Calling groups_inventory to load vars for managed_node2 30564 1726882851.07301: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882851.07311: Calling all_plugins_play to load vars for managed_node2 30564 1726882851.07314: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882851.07317: Calling groups_plugins_play to load vars for managed_node2 30564 1726882851.08327: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f6 30564 1726882851.08331: WORKER PROCESS EXITING 30564 1726882851.09100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882851.10922: done with get_vars() 30564 1726882851.10943: done getting variables 30564 1726882851.11005: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:40:51 -0400 (0:00:00.066) 0:00:49.691 ****** 30564 1726882851.11042: entering _queue_task() for managed_node2/fail 30564 1726882851.11314: worker is 1 (out of 1 available) 30564 1726882851.11326: exiting _queue_task() for managed_node2/fail 30564 1726882851.11339: done queuing things up, now waiting for results queue to drain 30564 1726882851.11341: waiting for pending results... 30564 1726882851.11637: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882851.11797: in run() - task 0e448fcc-3ce9-4216-acec-0000000010f7 30564 1726882851.11819: variable 'ansible_search_path' from source: unknown 30564 1726882851.11831: variable 'ansible_search_path' from source: unknown 30564 1726882851.11878: calling self._execute() 30564 1726882851.11980: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.11997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.12015: variable 'omit' from source: magic vars 30564 1726882851.12414: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.12435: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882851.12575: variable 'network_state' from source: role '' defaults 30564 1726882851.12594: Evaluated conditional (network_state != {}): False 30564 1726882851.12603: when evaluation is False, skipping this task 30564 1726882851.12611: _execute() done 30564 1726882851.12617: dumping result to json 30564 1726882851.12624: done dumping result, returning 30564 1726882851.12635: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-0000000010f7] 30564 1726882851.12647: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f7 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882851.12810: no more pending results, returning what we have 30564 1726882851.12814: results queue empty 30564 1726882851.12815: checking for any_errors_fatal 30564 1726882851.12823: done checking for any_errors_fatal 30564 1726882851.12824: checking for max_fail_percentage 30564 1726882851.12825: done checking for max_fail_percentage 30564 1726882851.12826: checking to see if all hosts have failed and the running result is not ok 30564 1726882851.12827: done checking to see if all hosts have failed 30564 1726882851.12828: getting the remaining hosts for this loop 30564 1726882851.12830: done getting the remaining hosts for this loop 30564 1726882851.12834: getting the next task for host managed_node2 30564 1726882851.12842: done getting next task for host managed_node2 30564 1726882851.12846: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882851.12851: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882851.12880: getting variables 30564 1726882851.12882: in VariableManager get_vars() 30564 1726882851.12920: Calling all_inventory to load vars for managed_node2 30564 1726882851.12924: Calling groups_inventory to load vars for managed_node2 30564 1726882851.12927: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882851.12939: Calling all_plugins_play to load vars for managed_node2 30564 1726882851.12942: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882851.12946: Calling groups_plugins_play to load vars for managed_node2 30564 1726882851.14088: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f7 30564 1726882851.14091: WORKER PROCESS EXITING 30564 1726882851.14787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882851.16570: done with get_vars() 30564 1726882851.16590: done getting variables 30564 1726882851.16639: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:40:51 -0400 (0:00:00.056) 0:00:49.748 ****** 30564 1726882851.16681: entering _queue_task() for managed_node2/fail 30564 1726882851.16923: worker is 1 (out of 1 available) 30564 1726882851.16935: exiting _queue_task() for managed_node2/fail 30564 1726882851.16946: done queuing things up, now waiting for results queue to drain 30564 1726882851.16947: waiting for pending results... 30564 1726882851.17233: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882851.17376: in run() - task 0e448fcc-3ce9-4216-acec-0000000010f8 30564 1726882851.17403: variable 'ansible_search_path' from source: unknown 30564 1726882851.17410: variable 'ansible_search_path' from source: unknown 30564 1726882851.17449: calling self._execute() 30564 1726882851.17556: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.17572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.17587: variable 'omit' from source: magic vars 30564 1726882851.17978: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.17994: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882851.18126: variable 'network_state' from source: role '' defaults 30564 1726882851.18141: Evaluated conditional (network_state != {}): False 30564 1726882851.18152: when evaluation is False, skipping this task 30564 1726882851.18162: _execute() done 30564 1726882851.18174: dumping result to json 30564 1726882851.18181: done dumping result, returning 30564 1726882851.18191: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-0000000010f8] 30564 1726882851.18200: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f8 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882851.18338: no more pending results, returning what we have 30564 1726882851.18342: results queue empty 30564 1726882851.18343: checking for any_errors_fatal 30564 1726882851.18354: done checking for any_errors_fatal 30564 1726882851.18355: checking for max_fail_percentage 30564 1726882851.18357: done checking for max_fail_percentage 30564 1726882851.18358: checking to see if all hosts have failed and the running result is not ok 30564 1726882851.18358: done checking to see if all hosts have failed 30564 1726882851.18359: getting the remaining hosts for this loop 30564 1726882851.18361: done getting the remaining hosts for this loop 30564 1726882851.18367: getting the next task for host managed_node2 30564 1726882851.18379: done getting next task for host managed_node2 30564 1726882851.18382: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882851.18389: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882851.18411: getting variables 30564 1726882851.18413: in VariableManager get_vars() 30564 1726882851.18450: Calling all_inventory to load vars for managed_node2 30564 1726882851.18453: Calling groups_inventory to load vars for managed_node2 30564 1726882851.18455: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882851.18470: Calling all_plugins_play to load vars for managed_node2 30564 1726882851.18473: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882851.18476: Calling groups_plugins_play to load vars for managed_node2 30564 1726882851.19485: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f8 30564 1726882851.19489: WORKER PROCESS EXITING 30564 1726882851.20210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882851.23304: done with get_vars() 30564 1726882851.23332: done getting variables 30564 1726882851.23402: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:40:51 -0400 (0:00:00.067) 0:00:49.815 ****** 30564 1726882851.23437: entering _queue_task() for managed_node2/fail 30564 1726882851.23832: worker is 1 (out of 1 available) 30564 1726882851.23847: exiting _queue_task() for managed_node2/fail 30564 1726882851.23861: done queuing things up, now waiting for results queue to drain 30564 1726882851.23863: waiting for pending results... 30564 1726882851.24872: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882851.25038: in run() - task 0e448fcc-3ce9-4216-acec-0000000010f9 30564 1726882851.25056: variable 'ansible_search_path' from source: unknown 30564 1726882851.25073: variable 'ansible_search_path' from source: unknown 30564 1726882851.25125: calling self._execute() 30564 1726882851.25248: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.25329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.25344: variable 'omit' from source: magic vars 30564 1726882851.25810: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.25829: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882851.26028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882851.28522: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882851.28605: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882851.28646: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882851.28697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882851.28726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882851.28819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.28871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.28913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.28959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.28984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.29090: variable 'ansible_distribution_major_version' from source: facts
30564 1726882851.29115: Evaluated conditional (ansible_distribution_major_version | int > 9): False
30564 1726882851.29126: when evaluation is False, skipping this task
30564 1726882851.29133: _execute() done
30564 1726882851.29139: dumping result to json
30564 1726882851.29146: done dumping result, returning
30564 1726882851.29157: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-0000000010f9]
30564 1726882851.29172: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f9
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
30564 1726882851.29310: no more pending results, returning what we have
30564 1726882851.29314: results queue empty
30564 1726882851.29315: checking for any_errors_fatal
30564 1726882851.29322: done checking for any_errors_fatal
30564 1726882851.29322: checking for max_fail_percentage
30564 1726882851.29325: done checking for max_fail_percentage
30564 1726882851.29326: checking to see if all hosts have failed and the running result is not ok
30564 1726882851.29326: done checking to see if all hosts have failed
30564 1726882851.29327: getting the remaining hosts for this loop
30564 1726882851.29329: done getting the remaining hosts for this loop
30564 1726882851.29333: getting the next task for host managed_node2
30564 1726882851.29342: done getting next task for host managed_node2
30564 1726882851.29346: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882851.29350: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882851.29376: getting variables
30564 1726882851.29378: in VariableManager get_vars()
30564 1726882851.29418: Calling all_inventory to load vars for managed_node2
30564 1726882851.29420: Calling groups_inventory to load vars for managed_node2
30564 1726882851.29423: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882851.29433: Calling all_plugins_play to load vars for managed_node2
30564 1726882851.29436: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882851.29438: Calling groups_plugins_play to load vars for managed_node2
30564 1726882851.30496: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010f9
30564 1726882851.30500: WORKER PROCESS EXITING
30564 1726882851.31294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882851.33137: done with get_vars()
30564 1726882851.33161: done getting variables
30564 1726882851.33224: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:40:51 -0400 (0:00:00.098) 0:00:49.913 ******
30564 1726882851.33263: entering _queue_task() for managed_node2/dnf
30564 1726882851.33574: worker is 1 (out of 1 available)
30564 1726882851.33588: exiting _queue_task() for managed_node2/dnf
30564 1726882851.33601: done queuing things up, now waiting for results queue to drain
30564 1726882851.33602: waiting for pending results...
30564 1726882851.33920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882851.34081: in run() - task 0e448fcc-3ce9-4216-acec-0000000010fa
30564 1726882851.34102: variable 'ansible_search_path' from source: unknown
30564 1726882851.34115: variable 'ansible_search_path' from source: unknown
30564 1726882851.34160: calling self._execute()
30564 1726882851.34271: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882851.34285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882851.34302: variable 'omit' from source: magic vars
30564 1726882851.34715: variable 'ansible_distribution_major_version' from source: facts
30564 1726882851.34731: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882851.34934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882851.37772: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882851.37837: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882851.37880: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882851.37920: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882851.37950: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882851.38037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.38078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.38109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.38157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.38183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.38301: variable 'ansible_distribution' from source: facts
30564 1726882851.38310: variable 'ansible_distribution_major_version' from source: facts
30564 1726882851.38331: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30564 1726882851.38471: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882851.38618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.38648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.38687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.38734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.38752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.38804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.38835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.38862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.38914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.38936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.38982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.39014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.39045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.39093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.39115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.39286: variable 'network_connections' from source: include params
30564 1726882851.39300: variable 'interface' from source: play vars
30564 1726882851.39375: variable 'interface' from source: play vars
30564 1726882851.39449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882851.39643: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882851.39693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882851.39726: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882851.39760: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882851.39812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882851.39838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882851.39885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.39918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882851.39983: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882851.40243: variable 'network_connections' from source: include params
30564 1726882851.40252: variable 'interface' from source: play vars
30564 1726882851.40323: variable 'interface' from source: play vars
30564 1726882851.40361: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882851.40373: when evaluation is False, skipping this task
30564 1726882851.40380: _execute() done
30564 1726882851.40386: dumping result to json
30564 1726882851.40393: done dumping result, returning
30564 1726882851.40402: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000010fa]
30564 1726882851.40415: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fa
30564 1726882851.40531: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fa
30564 1726882851.40538: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882851.40603: no more pending results, returning what we have
30564 1726882851.40608: results queue empty
30564 1726882851.40609: checking for any_errors_fatal
30564 1726882851.40616: done checking for any_errors_fatal
30564 1726882851.40617: checking for max_fail_percentage
30564 1726882851.40619: done checking for max_fail_percentage
30564 1726882851.40620: checking to see if all hosts have failed and the running result is not ok
30564 1726882851.40621: done checking to see if all hosts have failed
30564 1726882851.40622: getting the remaining hosts for this loop
30564 1726882851.40624: done getting the remaining hosts for this loop
30564 1726882851.40629: getting the next task for host managed_node2
30564 1726882851.40638: done getting next task for host managed_node2
30564 1726882851.40642: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882851.40648: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882851.40673: getting variables
30564 1726882851.40676: in VariableManager get_vars()
30564 1726882851.40717: Calling all_inventory to load vars for managed_node2
30564 1726882851.40719: Calling groups_inventory to load vars for managed_node2
30564 1726882851.40722: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882851.40733: Calling all_plugins_play to load vars for managed_node2
30564 1726882851.40736: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882851.40739: Calling groups_plugins_play to load vars for managed_node2
30564 1726882851.42704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882851.44524: done with get_vars()
30564 1726882851.44545: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30564 1726882851.44625: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:40:51 -0400 (0:00:00.113) 0:00:50.027 ******
30564 1726882851.44657: entering _queue_task() for managed_node2/yum
30564 1726882851.44957: worker is 1 (out of 1 available)
30564 1726882851.44974: exiting _queue_task() for managed_node2/yum
30564 1726882851.44990: done queuing things up, now waiting for results queue to drain
30564 1726882851.44991: waiting for pending results...
30564 1726882851.45292: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882851.45452: in run() - task 0e448fcc-3ce9-4216-acec-0000000010fb
30564 1726882851.45475: variable 'ansible_search_path' from source: unknown
30564 1726882851.45485: variable 'ansible_search_path' from source: unknown
30564 1726882851.45524: calling self._execute()
30564 1726882851.45632: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882851.45650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882851.45672: variable 'omit' from source: magic vars
30564 1726882851.46060: variable 'ansible_distribution_major_version' from source: facts
30564 1726882851.46090: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882851.46272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882851.48750: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882851.48828: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882851.48874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882851.48921: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882851.48951: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882851.49042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.49091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.49129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.49179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.49199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.49298: variable 'ansible_distribution_major_version' from source: facts
30564 1726882851.49317: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30564 1726882851.49332: when evaluation is False, skipping this task
30564 1726882851.49343: _execute() done
30564 1726882851.49350: dumping result to json
30564 1726882851.49358: done dumping result, returning
30564 1726882851.49374: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000010fb]
30564 1726882851.49384: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fb
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30564 1726882851.49540: no more pending results, returning what we have
30564 1726882851.49543: results queue empty
30564 1726882851.49544: checking for any_errors_fatal
30564 1726882851.49552: done checking for any_errors_fatal
30564 1726882851.49553: checking for max_fail_percentage
30564 1726882851.49555: done checking for max_fail_percentage
30564 1726882851.49556: checking to see if all hosts have failed and the running result is not ok
30564 1726882851.49556: done checking to see if all hosts have failed
30564 1726882851.49557: getting the remaining hosts for this loop
30564 1726882851.49559: done getting the remaining hosts for this loop
30564 1726882851.49563: getting the next task for host managed_node2
30564 1726882851.49576: done getting next task for host managed_node2
30564 1726882851.49580: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882851.49585: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882851.49608: getting variables
30564 1726882851.49610: in VariableManager get_vars()
30564 1726882851.49647: Calling all_inventory to load vars for managed_node2
30564 1726882851.49649: Calling groups_inventory to load vars for managed_node2
30564 1726882851.49652: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882851.49662: Calling all_plugins_play to load vars for managed_node2
30564 1726882851.49667: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882851.49673: Calling groups_plugins_play to load vars for managed_node2
30564 1726882851.50703: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fb
30564 1726882851.50706: WORKER PROCESS EXITING
30564 1726882851.51437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882851.53434: done with get_vars()
30564 1726882851.53461: done getting variables
30564 1726882851.53523: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:40:51 -0400 (0:00:00.088) 0:00:50.116 ******
30564 1726882851.53562: entering _queue_task() for managed_node2/fail
30564 1726882851.53851: worker is 1 (out of 1 available)
30564 1726882851.53863: exiting _queue_task() for managed_node2/fail
30564 1726882851.53879: done queuing things up, now waiting for results queue to drain
30564 1726882851.53881: waiting for pending results...
30564 1726882851.54160: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882851.54317: in run() - task 0e448fcc-3ce9-4216-acec-0000000010fc
30564 1726882851.54338: variable 'ansible_search_path' from source: unknown
30564 1726882851.54346: variable 'ansible_search_path' from source: unknown
30564 1726882851.54387: calling self._execute()
30564 1726882851.54488: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882851.54498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882851.54510: variable 'omit' from source: magic vars
30564 1726882851.54895: variable 'ansible_distribution_major_version' from source: facts
30564 1726882851.54913: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882851.55041: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882851.55266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882851.57754: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882851.57832: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882851.57881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882851.57930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882851.57960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882851.58050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.58103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.58141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.58192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.58213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.58272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.58302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.58331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.58387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.58408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.58451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882851.58492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882851.58522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.58571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882851.58594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882851.58789: variable 'network_connections' from source: include params
30564 1726882851.58811: variable 'interface' from source: play vars
30564 1726882851.58881: variable 'interface' from source: play vars
30564 1726882851.58960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882851.59139: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882851.59184: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882851.59220: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882851.59260: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882851.59311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882851.59344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882851.59381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.59413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882851.59487: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882851.59745: variable 'network_connections' from source: include params
30564 1726882851.59755: variable 'interface' from source: play vars
30564 1726882851.59828: variable 'interface' from source: play vars
30564 1726882851.59865: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882851.59882: when evaluation is False, skipping this task
30564 1726882851.59895: _execute() done
30564 1726882851.59902: dumping result to json
30564 1726882851.59910: done dumping result, returning
30564 1726882851.59921: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000010fc]
30564 1726882851.59931: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fc
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882851.60097: no more pending results, returning what we have
30564 1726882851.60101: results queue empty
30564 1726882851.60103: checking for any_errors_fatal
30564 1726882851.60111: done checking for any_errors_fatal
30564 1726882851.60112: checking for max_fail_percentage
30564 1726882851.60114: done checking for max_fail_percentage
30564 1726882851.60115: checking to see if all hosts have failed and the running result is not ok
30564 1726882851.60116: done checking to see if all hosts have failed
30564 1726882851.60117: getting the remaining hosts for this loop
30564 1726882851.60119: done getting the remaining hosts for this loop
30564 1726882851.60123: getting the next task for host managed_node2
30564 1726882851.60132: done getting next task for host managed_node2
30564 1726882851.60137: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
30564 1726882851.60142: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882851.60171: getting variables
30564 1726882851.60173: in VariableManager get_vars()
30564 1726882851.60210: Calling all_inventory to load vars for managed_node2
30564 1726882851.60212: Calling groups_inventory to load vars for managed_node2
30564 1726882851.60215: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882851.60226: Calling all_plugins_play to load vars for managed_node2
30564 1726882851.60229: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882851.60232: Calling groups_plugins_play to load vars for managed_node2
30564 1726882851.61204: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fc
30564 1726882851.61208: WORKER PROCESS EXITING
30564 1726882851.62031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882851.63861: done with get_vars()
30564 1726882851.63889: done getting variables
30564 1726882851.63951: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:40:51 -0400 (0:00:00.104) 0:00:50.221 ******
30564 1726882851.63990: entering _queue_task() for managed_node2/package
30564 1726882851.64266: worker is 1 (out of 1 available)
30564 1726882851.64289: exiting _queue_task() for managed_node2/package
30564 1726882851.64304: done queuing things up, now waiting for results queue to drain
30564 1726882851.64306: waiting for pending results...
30564 1726882851.64504: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages
30564 1726882851.64594: in run() - task 0e448fcc-3ce9-4216-acec-0000000010fd
30564 1726882851.64610: variable 'ansible_search_path' from source: unknown
30564 1726882851.64613: variable 'ansible_search_path' from source: unknown
30564 1726882851.64642: calling self._execute()
30564 1726882851.64721: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882851.64726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882851.64735: variable 'omit' from source: magic vars
30564 1726882851.65009: variable 'ansible_distribution_major_version' from source: facts
30564 1726882851.65021: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882851.65158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882851.65347: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882851.65387: Loading TestModule 'files' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882851.65411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882851.65450: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882851.65534: variable 'network_packages' from source: role '' defaults 30564 1726882851.65611: variable '__network_provider_setup' from source: role '' defaults 30564 1726882851.65618: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882851.65663: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882851.65674: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882851.65721: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882851.65838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882851.67901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882851.67950: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882851.67979: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882851.68005: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882851.68025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882851.68082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882851.68105: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.68124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.68150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.68161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882851.68195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882851.68212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.68232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.68257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.68268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882851.68409: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882851.68481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882851.68498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.68514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.68545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.68552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882851.68616: variable 'ansible_python' from source: facts 30564 1726882851.68628: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882851.68687: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882851.68739: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882851.68825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882851.68842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.68860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.68892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.68902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882851.68933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882851.68952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.68978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.69000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.69011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882851.69108: variable 'network_connections' from source: include params 
30564 1726882851.69112: variable 'interface' from source: play vars 30564 1726882851.69184: variable 'interface' from source: play vars 30564 1726882851.69234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882851.69253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882851.69277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.69317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882851.69349: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882851.69812: variable 'network_connections' from source: include params 30564 1726882851.69815: variable 'interface' from source: play vars 30564 1726882851.69817: variable 'interface' from source: play vars 30564 1726882851.69819: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882851.69920: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882851.70141: variable 'network_connections' from source: include params 30564 1726882851.70146: variable 'interface' from source: play vars 30564 1726882851.70212: variable 'interface' from source: play vars 30564 1726882851.70234: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882851.70313: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882851.70612: variable 'network_connections' 
from source: include params 30564 1726882851.70615: variable 'interface' from source: play vars 30564 1726882851.70685: variable 'interface' from source: play vars 30564 1726882851.70739: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882851.70798: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882851.70805: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882851.70866: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882851.71080: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882851.71568: variable 'network_connections' from source: include params 30564 1726882851.71575: variable 'interface' from source: play vars 30564 1726882851.71632: variable 'interface' from source: play vars 30564 1726882851.71640: variable 'ansible_distribution' from source: facts 30564 1726882851.71643: variable '__network_rh_distros' from source: role '' defaults 30564 1726882851.71650: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.71681: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882851.71841: variable 'ansible_distribution' from source: facts 30564 1726882851.71845: variable '__network_rh_distros' from source: role '' defaults 30564 1726882851.71850: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.71860: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882851.72024: variable 'ansible_distribution' from source: facts 30564 1726882851.72028: variable '__network_rh_distros' from source: role '' defaults 30564 1726882851.72033: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.72067: variable 'network_provider' from source: set_fact 30564 
1726882851.72085: variable 'ansible_facts' from source: unknown 30564 1726882851.72767: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882851.72770: when evaluation is False, skipping this task 30564 1726882851.72772: _execute() done 30564 1726882851.72777: dumping result to json 30564 1726882851.72779: done dumping result, returning 30564 1726882851.72788: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-0000000010fd] 30564 1726882851.72794: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fd 30564 1726882851.72888: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fd 30564 1726882851.72891: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882851.72939: no more pending results, returning what we have 30564 1726882851.72942: results queue empty 30564 1726882851.72943: checking for any_errors_fatal 30564 1726882851.72951: done checking for any_errors_fatal 30564 1726882851.72951: checking for max_fail_percentage 30564 1726882851.72953: done checking for max_fail_percentage 30564 1726882851.72954: checking to see if all hosts have failed and the running result is not ok 30564 1726882851.72954: done checking to see if all hosts have failed 30564 1726882851.72955: getting the remaining hosts for this loop 30564 1726882851.72958: done getting the remaining hosts for this loop 30564 1726882851.72962: getting the next task for host managed_node2 30564 1726882851.72972: done getting next task for host managed_node2 30564 1726882851.72976: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882851.72981: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882851.73003: getting variables 30564 1726882851.73004: in VariableManager get_vars() 30564 1726882851.73044: Calling all_inventory to load vars for managed_node2 30564 1726882851.73046: Calling groups_inventory to load vars for managed_node2 30564 1726882851.73048: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882851.73057: Calling all_plugins_play to load vars for managed_node2 30564 1726882851.73060: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882851.73062: Calling groups_plugins_play to load vars for managed_node2 30564 1726882851.74703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882851.76540: done with get_vars() 30564 1726882851.76571: done getting variables 30564 1726882851.76629: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:40:51 -0400 (0:00:00.126) 0:00:50.348 ****** 30564 1726882851.76671: entering _queue_task() for managed_node2/package 30564 1726882851.76967: worker is 1 (out of 1 available) 30564 1726882851.76980: exiting _queue_task() for managed_node2/package 30564 1726882851.76999: done queuing things up, now waiting for results queue to drain 30564 1726882851.77000: waiting for pending results... 
30564 1726882851.77333: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882851.77479: in run() - task 0e448fcc-3ce9-4216-acec-0000000010fe 30564 1726882851.77499: variable 'ansible_search_path' from source: unknown 30564 1726882851.77507: variable 'ansible_search_path' from source: unknown 30564 1726882851.77554: calling self._execute() 30564 1726882851.77678: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.77691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.77705: variable 'omit' from source: magic vars 30564 1726882851.78101: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.78122: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882851.78266: variable 'network_state' from source: role '' defaults 30564 1726882851.78284: Evaluated conditional (network_state != {}): False 30564 1726882851.78295: when evaluation is False, skipping this task 30564 1726882851.78307: _execute() done 30564 1726882851.78314: dumping result to json 30564 1726882851.78326: done dumping result, returning 30564 1726882851.78339: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-0000000010fe] 30564 1726882851.78351: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fe skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882851.78512: no more pending results, returning what we have 30564 1726882851.78516: results queue empty 30564 1726882851.78517: checking for any_errors_fatal 30564 1726882851.78525: done checking for any_errors_fatal 30564 1726882851.78525: checking for max_fail_percentage 30564 
1726882851.78527: done checking for max_fail_percentage 30564 1726882851.78528: checking to see if all hosts have failed and the running result is not ok 30564 1726882851.78529: done checking to see if all hosts have failed 30564 1726882851.78530: getting the remaining hosts for this loop 30564 1726882851.78532: done getting the remaining hosts for this loop 30564 1726882851.78535: getting the next task for host managed_node2 30564 1726882851.78545: done getting next task for host managed_node2 30564 1726882851.78549: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882851.78555: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882851.78583: getting variables 30564 1726882851.78585: in VariableManager get_vars() 30564 1726882851.78624: Calling all_inventory to load vars for managed_node2 30564 1726882851.78627: Calling groups_inventory to load vars for managed_node2 30564 1726882851.78629: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882851.78642: Calling all_plugins_play to load vars for managed_node2 30564 1726882851.78645: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882851.78649: Calling groups_plugins_play to load vars for managed_node2 30564 1726882851.79623: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010fe 30564 1726882851.79627: WORKER PROCESS EXITING 30564 1726882851.80434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882851.82420: done with get_vars() 30564 1726882851.82441: done getting variables 30564 1726882851.82506: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:40:51 -0400 (0:00:00.058) 0:00:50.406 ****** 30564 1726882851.82537: entering _queue_task() for managed_node2/package 30564 1726882851.82809: worker is 1 (out of 1 available) 30564 1726882851.82824: exiting _queue_task() for managed_node2/package 30564 1726882851.82837: done queuing things up, now waiting for results queue to drain 30564 1726882851.82838: waiting for pending results... 
30564 1726882851.83131: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882851.83295: in run() - task 0e448fcc-3ce9-4216-acec-0000000010ff 30564 1726882851.83313: variable 'ansible_search_path' from source: unknown 30564 1726882851.83320: variable 'ansible_search_path' from source: unknown 30564 1726882851.83362: calling self._execute() 30564 1726882851.83456: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.83477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.83493: variable 'omit' from source: magic vars 30564 1726882851.83871: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.83891: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882851.84036: variable 'network_state' from source: role '' defaults 30564 1726882851.84052: Evaluated conditional (network_state != {}): False 30564 1726882851.84059: when evaluation is False, skipping this task 30564 1726882851.84069: _execute() done 30564 1726882851.84076: dumping result to json 30564 1726882851.84082: done dumping result, returning 30564 1726882851.84093: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-0000000010ff] 30564 1726882851.84103: sending task result for task 0e448fcc-3ce9-4216-acec-0000000010ff skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882851.84265: no more pending results, returning what we have 30564 1726882851.84270: results queue empty 30564 1726882851.84271: checking for any_errors_fatal 30564 1726882851.84281: done checking for any_errors_fatal 30564 1726882851.84282: checking for max_fail_percentage 30564 
1726882851.84284: done checking for max_fail_percentage 30564 1726882851.84285: checking to see if all hosts have failed and the running result is not ok 30564 1726882851.84285: done checking to see if all hosts have failed 30564 1726882851.84286: getting the remaining hosts for this loop 30564 1726882851.84289: done getting the remaining hosts for this loop 30564 1726882851.84292: getting the next task for host managed_node2 30564 1726882851.84301: done getting next task for host managed_node2 30564 1726882851.84305: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882851.84310: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882851.84334: getting variables 30564 1726882851.84337: in VariableManager get_vars() 30564 1726882851.84374: Calling all_inventory to load vars for managed_node2 30564 1726882851.84377: Calling groups_inventory to load vars for managed_node2 30564 1726882851.84380: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882851.84392: Calling all_plugins_play to load vars for managed_node2 30564 1726882851.84395: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882851.84398: Calling groups_plugins_play to load vars for managed_node2 30564 1726882851.85415: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000010ff 30564 1726882851.85418: WORKER PROCESS EXITING 30564 1726882851.86118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882851.87973: done with get_vars() 30564 1726882851.87995: done getting variables 30564 1726882851.88048: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:40:51 -0400 (0:00:00.055) 0:00:50.462 ****** 30564 1726882851.88090: entering _queue_task() for managed_node2/service 30564 1726882851.88335: worker is 1 (out of 1 available) 30564 1726882851.88346: exiting _queue_task() for managed_node2/service 30564 1726882851.88358: done queuing things up, now waiting for results queue to drain 30564 1726882851.88359: waiting for pending results... 
30564 1726882851.88645: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882851.88798: in run() - task 0e448fcc-3ce9-4216-acec-000000001100 30564 1726882851.88828: variable 'ansible_search_path' from source: unknown 30564 1726882851.88838: variable 'ansible_search_path' from source: unknown 30564 1726882851.88878: calling self._execute() 30564 1726882851.88985: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882851.88997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882851.89011: variable 'omit' from source: magic vars 30564 1726882851.89463: variable 'ansible_distribution_major_version' from source: facts 30564 1726882851.89486: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882851.89620: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882851.89838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882851.92699: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882851.92780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882851.92820: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882851.92873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882851.92903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882851.92992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882851.93028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.93065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.93120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.93141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882851.93202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882851.93231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.93261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.93318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.93337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882851.93388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882851.93424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882851.93454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882851.93510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882851.93532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882851.93730: variable 'network_connections' from source: include params 30564 1726882851.93748: variable 'interface' from source: play vars 30564 1726882851.93821: variable 'interface' from source: play vars 30564 1726882851.93906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882851.94106: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882851.94152: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882851.94192: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882851.94224: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882851.94283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882851.94310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882851.94340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882851.94382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882851.94448: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882851.94728: variable 'network_connections' from source: include params
30564 1726882851.94738: variable 'interface' from source: play vars
30564 1726882851.94806: variable 'interface' from source: play vars
30564 1726882851.94849: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882851.94858: when evaluation is False, skipping this task
30564 1726882851.94868: _execute() done
30564 1726882851.94876: dumping result to json
30564 1726882851.94882: done dumping result, returning
30564 1726882851.94893: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001100]
30564 1726882851.94908: sending task result for task 0e448fcc-3ce9-4216-acec-000000001100
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882851.95082: no more pending results, returning what we have
30564 1726882851.95087: results queue empty
30564 1726882851.95088: checking for any_errors_fatal
30564 1726882851.95096: done checking for any_errors_fatal
30564 1726882851.95097: checking for max_fail_percentage
30564 1726882851.95099: done checking for max_fail_percentage
30564 1726882851.95100: checking to see if all hosts have failed and the running result is not ok
30564 1726882851.95101: done checking to see if all hosts have failed
30564 1726882851.95102: getting the remaining hosts for this loop
30564 1726882851.95104: done getting the remaining hosts for this loop
30564 1726882851.95108: getting the next task for host managed_node2
30564 1726882851.95116: done getting next task for host managed_node2
30564 1726882851.95121: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30564 1726882851.95126: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882851.95148: getting variables
30564 1726882851.95150: in VariableManager get_vars()
30564 1726882851.95191: Calling all_inventory to load vars for managed_node2
30564 1726882851.95194: Calling groups_inventory to load vars for managed_node2
30564 1726882851.95196: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882851.95207: Calling all_plugins_play to load vars for managed_node2
30564 1726882851.95210: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882851.95213: Calling groups_plugins_play to load vars for managed_node2
30564 1726882851.96217: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001100
30564 1726882851.96220: WORKER PROCESS EXITING
30564 1726882851.97211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882851.98980: done with get_vars()
30564 1726882851.99002: done getting variables
30564 1726882851.99062: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:40:51 -0400 (0:00:00.110) 0:00:50.572 ******
30564 1726882851.99098: entering _queue_task() for managed_node2/service
30564 1726882851.99397: worker is 1 (out of 1 available)
30564 1726882851.99408: exiting _queue_task() for managed_node2/service
30564 1726882851.99420: done queuing
things up, now waiting for results queue to drain 30564 1726882851.99421: waiting for pending results... 30564 1726882851.99717: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882851.99869: in run() - task 0e448fcc-3ce9-4216-acec-000000001101 30564 1726882851.99900: variable 'ansible_search_path' from source: unknown 30564 1726882851.99922: variable 'ansible_search_path' from source: unknown 30564 1726882851.99976: calling self._execute() 30564 1726882852.00079: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.00094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.00110: variable 'omit' from source: magic vars 30564 1726882852.00533: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.00551: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882852.00740: variable 'network_provider' from source: set_fact 30564 1726882852.00752: variable 'network_state' from source: role '' defaults 30564 1726882852.00768: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882852.00778: variable 'omit' from source: magic vars 30564 1726882852.00860: variable 'omit' from source: magic vars 30564 1726882852.00895: variable 'network_service_name' from source: role '' defaults 30564 1726882852.00975: variable 'network_service_name' from source: role '' defaults 30564 1726882852.01093: variable '__network_provider_setup' from source: role '' defaults 30564 1726882852.01103: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882852.01178: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882852.01191: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882852.01261: variable '__network_packages_default_nm' from source: role '' defaults 
30564 1726882852.01625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882852.03184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882852.03232: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882852.03259: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882852.03298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882852.03318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882852.03380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.03399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.03417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.03446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.03457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.03493: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.03508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.03525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.03584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.03595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.03793: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882852.03891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.03908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.03924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.03969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.04774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.04778: variable 'ansible_python' from source: facts 30564 1726882852.04780: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882852.04782: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882852.04784: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882852.04787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.04789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.04791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.04793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.04795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.04797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.04807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.04810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.04812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.04814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.04816: variable 'network_connections' from source: include params 30564 1726882852.04819: variable 'interface' from source: play vars 30564 1726882852.05174: variable 'interface' from source: play vars 30564 1726882852.05177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882852.05180: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882852.05272: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882852.05277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882852.05397: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882852.05400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882852.05413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882852.05450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.05487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882852.05532: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882852.05831: variable 'network_connections' from source: include params 30564 1726882852.05834: variable 'interface' from source: play vars 30564 1726882852.05908: variable 'interface' from source: play vars 30564 1726882852.05952: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882852.06012: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882852.06286: variable 'network_connections' from source: include params 30564 1726882852.06290: variable 'interface' from source: play vars 30564 1726882852.06342: variable 'interface' from source: play vars 30564 1726882852.06374: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882852.06427: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882852.06726: variable 'network_connections' from source: include params 30564 1726882852.06730: variable 'interface' from source: play vars 30564 1726882852.06873: variable 'interface' from source: play vars 30564 1726882852.06876: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882852.06878: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882852.06887: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882852.06948: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882852.07177: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882852.07683: variable 'network_connections' from source: include params 30564 1726882852.07687: variable 'interface' from source: play vars 30564 1726882852.07750: variable 'interface' from source: play vars 30564 1726882852.07753: variable 'ansible_distribution' from source: facts 30564 1726882852.07756: variable '__network_rh_distros' from source: role '' defaults 30564 1726882852.07758: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.07796: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882852.07963: variable 'ansible_distribution' from source: facts 30564 1726882852.07968: variable '__network_rh_distros' from source: role '' defaults 30564 1726882852.07975: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.07982: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882852.08118: variable 'ansible_distribution' from source: facts 30564 1726882852.08121: variable '__network_rh_distros' from source: role '' defaults 30564 1726882852.08126: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.08151: variable 'network_provider' from source: set_fact 30564 1726882852.08170: variable 'omit' from source: magic vars 30564 1726882852.08191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882852.08218: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882852.08230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882852.08242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882852.08251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882852.08279: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882852.08282: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.08285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.08350: Set connection var ansible_timeout to 10 30564 1726882852.08354: Set connection var ansible_pipelining to False 30564 1726882852.08356: Set connection var ansible_shell_type to sh 30564 1726882852.08362: Set connection var ansible_shell_executable to /bin/sh 30564 1726882852.08371: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882852.08375: Set connection var ansible_connection to ssh 30564 1726882852.08395: variable 'ansible_shell_executable' from source: unknown 30564 1726882852.08398: variable 'ansible_connection' from source: unknown 30564 1726882852.08400: variable 'ansible_module_compression' from source: unknown 30564 1726882852.08402: variable 'ansible_shell_type' from source: unknown 30564 1726882852.08404: variable 'ansible_shell_executable' from source: unknown 30564 1726882852.08406: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.08409: variable 'ansible_pipelining' from source: unknown 30564 1726882852.08412: variable 'ansible_timeout' from source: unknown 30564 1726882852.08416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882852.08492: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882852.08498: variable 'omit' from source: magic vars 30564 1726882852.08501: starting attempt loop 30564 1726882852.08504: running the handler 30564 1726882852.08559: variable 'ansible_facts' from source: unknown 30564 1726882852.09168: _low_level_execute_command(): starting 30564 1726882852.09176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882852.09639: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.09648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.09671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.09686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.09696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.09740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 
1726882852.09752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882852.09804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.09903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.11545: stdout chunk (state=3): >>>/root <<< 30564 1726882852.11649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882852.11694: stderr chunk (state=3): >>><<< 30564 1726882852.11698: stdout chunk (state=3): >>><<< 30564 1726882852.11713: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882852.11722: _low_level_execute_command(): starting 30564 1726882852.11730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` 
echo /root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473 `" && echo ansible-tmp-1726882852.1171205-32765-62257508313473="` echo /root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473 `" ) && sleep 0' 30564 1726882852.12126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.12131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.12183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.12186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.12189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882852.12191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.12241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882852.12244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.12348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.14231: stdout chunk (state=3): >>>ansible-tmp-1726882852.1171205-32765-62257508313473=/root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473 
<<< 30564 1726882852.14343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882852.14390: stderr chunk (state=3): >>><<< 30564 1726882852.14394: stdout chunk (state=3): >>><<< 30564 1726882852.14408: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882852.1171205-32765-62257508313473=/root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882852.14428: variable 'ansible_module_compression' from source: unknown 30564 1726882852.14467: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882852.14516: variable 'ansible_facts' from source: unknown 30564 1726882852.14644: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473/AnsiballZ_systemd.py 30564 1726882852.14748: Sending initial data 30564 1726882852.14751: Sent initial data (155 bytes) 30564 1726882852.15372: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.15376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.15408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.15421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.15474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882852.15490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.15587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.17376: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882852.17474: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882852.17568: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp_83dmp90 /root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473/AnsiballZ_systemd.py <<< 30564 1726882852.17662: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882852.19670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882852.19750: stderr chunk (state=3): >>><<< 30564 1726882852.19755: stdout chunk (state=3): >>><<< 30564 1726882852.19772: done transferring module to remote 30564 1726882852.19778: _low_level_execute_command(): starting 30564 1726882852.19784: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473/ /root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473/AnsiballZ_systemd.py && sleep 0' 30564 1726882852.20185: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.20192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.20226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 
1726882852.20233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.20238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882852.20247: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.20297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882852.20309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.20416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.22215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882852.22257: stderr chunk (state=3): >>><<< 30564 1726882852.22260: stdout chunk (state=3): >>><<< 30564 1726882852.22273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882852.22276: _low_level_execute_command(): starting 30564 1726882852.22280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473/AnsiballZ_systemd.py && sleep 0' 30564 1726882852.22677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.22695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.22707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.22718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.22763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882852.22786: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.22887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.47859: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882852.47893: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9142272", "MemoryAvailable": "infinity", "CPUUsageNSec": "2168812000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", 
"LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882852.47897: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": 
"yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882852.49519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882852.49524: stdout chunk (state=3): >>><<< 30564 1726882852.49528: stderr chunk (state=3): >>><<< 30564 1726882852.49816: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9142272", "MemoryAvailable": "infinity", "CPUUsageNSec": "2168812000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882852.49828: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882852.49831: _low_level_execute_command(): starting 30564 1726882852.49834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882852.1171205-32765-62257508313473/ > /dev/null 2>&1 && sleep 0' 30564 1726882852.50427: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882852.50441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.50456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.50481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.50523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882852.50536: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882852.50552: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.50578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882852.50592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882852.50603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882852.50616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.50631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.50647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.50659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882852.50676: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882852.50692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.50774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882852.50792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882852.50808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.50947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.52808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882852.52811: stdout chunk (state=3): >>><<< 30564 1726882852.52818: stderr chunk (state=3): >>><<< 30564 1726882852.52837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882852.52844: handler run complete 30564 1726882852.52904: attempt loop complete, returning result 30564 1726882852.52907: _execute() done 30564 1726882852.52909: dumping result to json 30564 1726882852.52926: done dumping result, returning 30564 1726882852.52938: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000001101] 30564 1726882852.52943: sending task result for task 0e448fcc-3ce9-4216-acec-000000001101 30564 1726882852.53179: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001101 30564 1726882852.53182: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882852.53345: no more pending results, returning what we have 30564 1726882852.53348: results queue empty 30564 1726882852.53349: checking for any_errors_fatal 30564 1726882852.53354: done checking for 
any_errors_fatal 30564 1726882852.53354: checking for max_fail_percentage 30564 1726882852.53356: done checking for max_fail_percentage 30564 1726882852.53358: checking to see if all hosts have failed and the running result is not ok 30564 1726882852.53358: done checking to see if all hosts have failed 30564 1726882852.53359: getting the remaining hosts for this loop 30564 1726882852.53361: done getting the remaining hosts for this loop 30564 1726882852.53367: getting the next task for host managed_node2 30564 1726882852.53376: done getting next task for host managed_node2 30564 1726882852.53380: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882852.53385: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882852.53398: getting variables 30564 1726882852.53400: in VariableManager get_vars() 30564 1726882852.53439: Calling all_inventory to load vars for managed_node2 30564 1726882852.53441: Calling groups_inventory to load vars for managed_node2 30564 1726882852.53443: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882852.53453: Calling all_plugins_play to load vars for managed_node2 30564 1726882852.53456: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882852.53468: Calling groups_plugins_play to load vars for managed_node2 30564 1726882852.55246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882852.57027: done with get_vars() 30564 1726882852.57055: done getting variables 30564 1726882852.57119: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:40:52 -0400 (0:00:00.580) 0:00:51.152 ****** 30564 1726882852.57160: entering _queue_task() for managed_node2/service 30564 1726882852.57491: worker is 1 (out of 1 available) 30564 1726882852.57504: exiting _queue_task() for managed_node2/service 30564 1726882852.57518: done queuing things up, now waiting for results queue to drain 30564 1726882852.57519: waiting for pending results... 
30564 1726882852.57831: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882852.57976: in run() - task 0e448fcc-3ce9-4216-acec-000000001102 30564 1726882852.57987: variable 'ansible_search_path' from source: unknown 30564 1726882852.57992: variable 'ansible_search_path' from source: unknown 30564 1726882852.58032: calling self._execute() 30564 1726882852.58133: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.58138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.58148: variable 'omit' from source: magic vars 30564 1726882852.58537: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.58554: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882852.58680: variable 'network_provider' from source: set_fact 30564 1726882852.58686: Evaluated conditional (network_provider == "nm"): True 30564 1726882852.58781: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882852.58870: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882852.59036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882852.61329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882852.61399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882852.61437: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882852.61472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882852.61499: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882852.61597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.61622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.61643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.61685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.61701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.61744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.61772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.61794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.61835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.61851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.61894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.61912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.61937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.61979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.61992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.62144: variable 'network_connections' from source: include params 30564 1726882852.62158: variable 'interface' from source: play vars 30564 1726882852.62219: variable 'interface' from source: play vars 30564 1726882852.62301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882852.62473: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882852.62508: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882852.62544: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882852.62578: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882852.62618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882852.62644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882852.62674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.62700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882852.62750: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882852.63013: variable 'network_connections' from source: include params 30564 1726882852.63016: variable 'interface' from source: play vars 30564 1726882852.63082: variable 'interface' from source: play vars 30564 1726882852.63123: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882852.63126: when evaluation is False, skipping this task 30564 1726882852.63129: _execute() done 30564 1726882852.63132: dumping result to json 30564 1726882852.63134: done dumping result, returning 30564 1726882852.63142: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000001102] 30564 
1726882852.63153: sending task result for task 0e448fcc-3ce9-4216-acec-000000001102 30564 1726882852.63239: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001102 30564 1726882852.63242: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882852.63321: no more pending results, returning what we have 30564 1726882852.63325: results queue empty 30564 1726882852.63326: checking for any_errors_fatal 30564 1726882852.63351: done checking for any_errors_fatal 30564 1726882852.63352: checking for max_fail_percentage 30564 1726882852.63354: done checking for max_fail_percentage 30564 1726882852.63355: checking to see if all hosts have failed and the running result is not ok 30564 1726882852.63356: done checking to see if all hosts have failed 30564 1726882852.63356: getting the remaining hosts for this loop 30564 1726882852.63358: done getting the remaining hosts for this loop 30564 1726882852.63364: getting the next task for host managed_node2 30564 1726882852.63374: done getting next task for host managed_node2 30564 1726882852.63379: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882852.63384: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
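The skip recorded above (`false_condition: __network_wpa_supplicant_required`) comes from the role gating the `wpa_supplicant` service on whether any wireless or 802.1x connections are defined; in this run `network_provider == "nm"` evaluated True but the supplicant requirement evaluated False. A minimal sketch of such a guarded task, assuming a shape like the real one at `roles/network/tasks/main.yml:133` (module arguments are illustrative, not taken from the role source):

```yaml
# Hypothetical sketch of the guarded service task; the actual task in
# fedora.linux_system_roles.network may differ in arguments and naming.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
```

When any `when:` entry is False, the TaskExecutor short-circuits exactly as logged: `when evaluation is False, skipping this task`, and a `skipping:` result is returned instead of running the module.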
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882852.63406: getting variables 30564 1726882852.63408: in VariableManager get_vars() 30564 1726882852.63446: Calling all_inventory to load vars for managed_node2 30564 1726882852.63449: Calling groups_inventory to load vars for managed_node2 30564 1726882852.63452: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882852.63462: Calling all_plugins_play to load vars for managed_node2 30564 1726882852.63467: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882852.63470: Calling groups_plugins_play to load vars for managed_node2 30564 1726882852.65115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882852.66869: done with get_vars() 30564 1726882852.66894: done getting variables 30564 1726882852.66959: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:40:52 -0400 (0:00:00.098) 0:00:51.251 
****** 30564 1726882852.66996: entering _queue_task() for managed_node2/service 30564 1726882852.67313: worker is 1 (out of 1 available) 30564 1726882852.67326: exiting _queue_task() for managed_node2/service 30564 1726882852.67339: done queuing things up, now waiting for results queue to drain 30564 1726882852.67340: waiting for pending results... 30564 1726882852.67633: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882852.67757: in run() - task 0e448fcc-3ce9-4216-acec-000000001103 30564 1726882852.67773: variable 'ansible_search_path' from source: unknown 30564 1726882852.67778: variable 'ansible_search_path' from source: unknown 30564 1726882852.67812: calling self._execute() 30564 1726882852.67904: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.67911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.67920: variable 'omit' from source: magic vars 30564 1726882852.68276: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.68289: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882852.68401: variable 'network_provider' from source: set_fact 30564 1726882852.68406: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882852.68408: when evaluation is False, skipping this task 30564 1726882852.68411: _execute() done 30564 1726882852.68413: dumping result to json 30564 1726882852.68416: done dumping result, returning 30564 1726882852.68424: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000001103] 30564 1726882852.68428: sending task result for task 0e448fcc-3ce9-4216-acec-000000001103 30564 1726882852.68531: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001103 30564 1726882852.68534: WORKER PROCESS EXITING skipping: 
[managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882852.68591: no more pending results, returning what we have 30564 1726882852.68595: results queue empty 30564 1726882852.68596: checking for any_errors_fatal 30564 1726882852.68603: done checking for any_errors_fatal 30564 1726882852.68603: checking for max_fail_percentage 30564 1726882852.68606: done checking for max_fail_percentage 30564 1726882852.68607: checking to see if all hosts have failed and the running result is not ok 30564 1726882852.68607: done checking to see if all hosts have failed 30564 1726882852.68608: getting the remaining hosts for this loop 30564 1726882852.68611: done getting the remaining hosts for this loop 30564 1726882852.68614: getting the next task for host managed_node2 30564 1726882852.68624: done getting next task for host managed_node2 30564 1726882852.68628: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882852.68633: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882852.68659: getting variables 30564 1726882852.68661: in VariableManager get_vars() 30564 1726882852.68705: Calling all_inventory to load vars for managed_node2 30564 1726882852.68708: Calling groups_inventory to load vars for managed_node2 30564 1726882852.68711: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882852.68723: Calling all_plugins_play to load vars for managed_node2 30564 1726882852.68727: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882852.68730: Calling groups_plugins_play to load vars for managed_node2 30564 1726882852.75184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882852.76937: done with get_vars() 30564 1726882852.76960: done getting variables 30564 1726882852.77014: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:40:52 -0400 (0:00:00.100) 0:00:51.351 ****** 30564 1726882852.77043: entering _queue_task() for managed_node2/copy 30564 1726882852.77380: worker is 1 (out of 1 available) 30564 1726882852.77392: exiting _queue_task() for managed_node2/copy 30564 1726882852.77405: done queuing things up, now waiting for results queue to drain 30564 1726882852.77407: waiting for 
pending results... 30564 1726882852.77709: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882852.77859: in run() - task 0e448fcc-3ce9-4216-acec-000000001104 30564 1726882852.77876: variable 'ansible_search_path' from source: unknown 30564 1726882852.77880: variable 'ansible_search_path' from source: unknown 30564 1726882852.77916: calling self._execute() 30564 1726882852.78017: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.78028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.78041: variable 'omit' from source: magic vars 30564 1726882852.78438: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.78451: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882852.78576: variable 'network_provider' from source: set_fact 30564 1726882852.78581: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882852.78584: when evaluation is False, skipping this task 30564 1726882852.78587: _execute() done 30564 1726882852.78591: dumping result to json 30564 1726882852.78594: done dumping result, returning 30564 1726882852.78603: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000001104] 30564 1726882852.78615: sending task result for task 0e448fcc-3ce9-4216-acec-000000001104 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882852.78771: no more pending results, returning what we have 30564 1726882852.78775: results queue empty 30564 1726882852.78776: checking for any_errors_fatal 30564 1726882852.78784: done checking for any_errors_fatal 30564 1726882852.78784: checking for 
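Both "Enable network service" and "Ensure initscripts network file dependency is present" are skipped here for the same reason: `network_provider` was set to `nm` by an earlier `set_fact`, so the `network_provider == "initscripts"` guard is False. A hedged sketch of that provider gate (service name and arguments are assumptions for illustration, not read from the role):

```yaml
# Sketch only: initscripts-specific tasks are gated on the provider fact,
# so on a NetworkManager-provided host they always skip.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
```

This is why the log shows consecutive `Evaluated conditional (network_provider == "initscripts"): False` entries at `main.yml:142` and `main.yml:150` before the play moves on to "Configure networking connection profiles".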
max_fail_percentage 30564 1726882852.78786: done checking for max_fail_percentage 30564 1726882852.78787: checking to see if all hosts have failed and the running result is not ok 30564 1726882852.78788: done checking to see if all hosts have failed 30564 1726882852.78789: getting the remaining hosts for this loop 30564 1726882852.78791: done getting the remaining hosts for this loop 30564 1726882852.78794: getting the next task for host managed_node2 30564 1726882852.78804: done getting next task for host managed_node2 30564 1726882852.78808: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882852.78815: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882852.78843: getting variables 30564 1726882852.78845: in VariableManager get_vars() 30564 1726882852.78893: Calling all_inventory to load vars for managed_node2 30564 1726882852.78896: Calling groups_inventory to load vars for managed_node2 30564 1726882852.78899: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882852.78914: Calling all_plugins_play to load vars for managed_node2 30564 1726882852.78918: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882852.78922: Calling groups_plugins_play to load vars for managed_node2 30564 1726882852.79525: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001104 30564 1726882852.79528: WORKER PROCESS EXITING 30564 1726882852.80270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882852.81233: done with get_vars() 30564 1726882852.81250: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:40:52 -0400 (0:00:00.042) 0:00:51.394 ****** 30564 1726882852.81312: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882852.81524: worker is 1 (out of 1 available) 30564 1726882852.81538: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882852.81550: done queuing things up, now waiting for results queue to drain 30564 1726882852.81552: waiting for pending results... 
30564 1726882852.81774: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882852.81930: in run() - task 0e448fcc-3ce9-4216-acec-000000001105 30564 1726882852.81950: variable 'ansible_search_path' from source: unknown 30564 1726882852.81957: variable 'ansible_search_path' from source: unknown 30564 1726882852.82007: calling self._execute() 30564 1726882852.82128: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.82142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.82156: variable 'omit' from source: magic vars 30564 1726882852.82548: variable 'ansible_distribution_major_version' from source: facts 30564 1726882852.82571: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882852.82584: variable 'omit' from source: magic vars 30564 1726882852.82655: variable 'omit' from source: magic vars 30564 1726882852.82825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882852.84653: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882852.84700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882852.84728: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882852.84760: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882852.84785: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882852.84841: variable 'network_provider' from source: set_fact 30564 1726882852.84943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882852.84961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882852.85000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882852.85037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882852.85051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882852.85124: variable 'omit' from source: magic vars 30564 1726882852.85374: variable 'omit' from source: magic vars 30564 1726882852.85378: variable 'network_connections' from source: include params 30564 1726882852.85381: variable 'interface' from source: play vars 30564 1726882852.85399: variable 'interface' from source: play vars 30564 1726882852.85542: variable 'omit' from source: magic vars 30564 1726882852.85550: variable '__lsr_ansible_managed' from source: task vars 30564 1726882852.85610: variable '__lsr_ansible_managed' from source: task vars 30564 1726882852.85780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882852.85999: Loaded config def from plugin (lookup/template) 30564 1726882852.86003: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882852.86029: File lookup term: get_ansible_managed.j2 30564 1726882852.86032: variable 
'ansible_search_path' from source: unknown 30564 1726882852.86036: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882852.86049: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882852.86066: variable 'ansible_search_path' from source: unknown 30564 1726882852.92417: variable 'ansible_managed' from source: unknown 30564 1726882852.92563: variable 'omit' from source: magic vars 30564 1726882852.92593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882852.92620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882852.92637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882852.92655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882852.92664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882852.92696: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882852.92699: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.92702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.92793: Set connection var ansible_timeout to 10 30564 1726882852.92797: Set connection var ansible_pipelining to False 30564 1726882852.92800: Set connection var ansible_shell_type to sh 30564 1726882852.92806: Set connection var ansible_shell_executable to /bin/sh 30564 1726882852.92813: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882852.92816: Set connection var ansible_connection to ssh 30564 1726882852.92842: variable 'ansible_shell_executable' from source: unknown 30564 1726882852.92845: variable 'ansible_connection' from source: unknown 30564 1726882852.92848: variable 'ansible_module_compression' from source: unknown 30564 1726882852.92851: variable 'ansible_shell_type' from source: unknown 30564 1726882852.92853: variable 'ansible_shell_executable' from source: unknown 30564 1726882852.92856: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882852.92859: variable 'ansible_pipelining' from source: unknown 30564 1726882852.92862: variable 'ansible_timeout' from source: unknown 30564 1726882852.92864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882852.92994: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882852.93006: variable 'omit' from 
source: magic vars 30564 1726882852.93012: starting attempt loop 30564 1726882852.93015: running the handler 30564 1726882852.93027: _low_level_execute_command(): starting 30564 1726882852.93033: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882852.93725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882852.93736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.93747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.93759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.93800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882852.93806: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882852.93816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.93828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882852.93835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882852.93841: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882852.93850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.93859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.93874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.93882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882852.93888: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882852.93897: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.93980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882852.93984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882852.93992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.94127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.95805: stdout chunk (state=3): >>>/root <<< 30564 1726882852.95979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882852.95984: stdout chunk (state=3): >>><<< 30564 1726882852.95994: stderr chunk (state=3): >>><<< 30564 1726882852.96015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882852.96028: 
_low_level_execute_command(): starting 30564 1726882852.96036: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383 `" && echo ansible-tmp-1726882852.9601495-32793-138411208111383="` echo /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383 `" ) && sleep 0' 30564 1726882852.96677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882852.96687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.96698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.96714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.96751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882852.96758: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882852.96774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.96788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882852.96796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882852.96802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882852.96811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882852.96819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882852.96830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882852.96838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
30564 1726882852.96844: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882852.96853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882852.96937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882852.96944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882852.96952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882852.97087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882852.98989: stdout chunk (state=3): >>>ansible-tmp-1726882852.9601495-32793-138411208111383=/root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383 <<< 30564 1726882852.99104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882852.99196: stderr chunk (state=3): >>><<< 30564 1726882852.99208: stdout chunk (state=3): >>><<< 30564 1726882852.99478: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882852.9601495-32793-138411208111383=/root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882852.99486: variable 'ansible_module_compression' from source: unknown 30564 1726882852.99489: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882852.99491: variable 'ansible_facts' from source: unknown 30564 1726882852.99501: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383/AnsiballZ_network_connections.py 30564 1726882852.99671: Sending initial data 30564 1726882852.99676: Sent initial data (168 bytes) 30564 1726882853.00730: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882853.00743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.00756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.00786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.00826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.00837: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882853.00849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.00866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882853.00880: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 
1726882853.00897: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882853.00909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.00920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.00934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.00944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.00954: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882853.00967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.01050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.01078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.01095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.01242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.03039: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882853.03133: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 
1726882853.03233: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpwk6g59px /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383/AnsiballZ_network_connections.py <<< 30564 1726882853.03329: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882853.05062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882853.05271: stderr chunk (state=3): >>><<< 30564 1726882853.05274: stdout chunk (state=3): >>><<< 30564 1726882853.05276: done transferring module to remote 30564 1726882853.05380: _low_level_execute_command(): starting 30564 1726882853.05384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383/ /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383/AnsiballZ_network_connections.py && sleep 0' 30564 1726882853.06017: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882853.06043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.06060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.06087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.06127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.06148: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882853.06162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.06185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882853.06196: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882853.06206: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 30564 1726882853.06217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.06230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.06245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.06266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.06280: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882853.06293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.06381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.06407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.06432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.06604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.08376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882853.08426: stderr chunk (state=3): >>><<< 30564 1726882853.08430: stdout chunk (state=3): >>><<< 30564 1726882853.08447: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882853.08461: _low_level_execute_command(): starting 30564 1726882853.08467: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383/AnsiballZ_network_connections.py && sleep 0' 30564 1726882853.09155: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882853.09159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.09161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.09163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.09166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.09169: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882853.09170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.09172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882853.09174: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882853.09176: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882853.09178: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.09180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.09181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.09183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.09185: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882853.09187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.09231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.09234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.09253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.09395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.35392: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882853.37094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.11.158 closed. <<< 30564 1726882853.37147: stderr chunk (state=3): >>><<< 30564 1726882853.37152: stdout chunk (state=3): >>><<< 30564 1726882853.37168: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882853.37201: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882853.37208: _low_level_execute_command(): starting 30564 1726882853.37213: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882852.9601495-32793-138411208111383/ > /dev/null 2>&1 && sleep 0' 30564 1726882853.37662: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.37668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.37705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 
1726882853.37708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.37711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.37769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.37773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.37778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.37882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.39704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882853.39749: stderr chunk (state=3): >>><<< 30564 1726882853.39754: stdout chunk (state=3): >>><<< 30564 1726882853.39771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882853.39775: handler run complete 30564 1726882853.39796: attempt loop complete, returning result 30564 1726882853.39799: _execute() done 30564 1726882853.39801: dumping result to json 30564 1726882853.39806: done dumping result, returning 30564 1726882853.39814: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-000000001105] 30564 1726882853.39821: sending task result for task 0e448fcc-3ce9-4216-acec-000000001105 30564 1726882853.39921: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001105 30564 1726882853.39926: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196 30564 1726882853.40027: no more pending results, returning what we have 30564 1726882853.40030: results queue empty 30564 1726882853.40031: checking for any_errors_fatal 30564 1726882853.40038: done checking for any_errors_fatal 30564 1726882853.40039: checking for max_fail_percentage 30564 
1726882853.40040: done checking for max_fail_percentage 30564 1726882853.40041: checking to see if all hosts have failed and the running result is not ok 30564 1726882853.40042: done checking to see if all hosts have failed 30564 1726882853.40043: getting the remaining hosts for this loop 30564 1726882853.40045: done getting the remaining hosts for this loop 30564 1726882853.40048: getting the next task for host managed_node2 30564 1726882853.40055: done getting next task for host managed_node2 30564 1726882853.40059: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882853.40065: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882853.40078: getting variables 30564 1726882853.40079: in VariableManager get_vars() 30564 1726882853.40114: Calling all_inventory to load vars for managed_node2 30564 1726882853.40116: Calling groups_inventory to load vars for managed_node2 30564 1726882853.40119: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882853.40128: Calling all_plugins_play to load vars for managed_node2 30564 1726882853.40131: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882853.40133: Calling groups_plugins_play to load vars for managed_node2 30564 1726882853.41114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882853.42057: done with get_vars() 30564 1726882853.42076: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:40:53 -0400 (0:00:00.608) 0:00:52.002 ****** 30564 1726882853.42137: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882853.42360: worker is 1 (out of 1 available) 30564 1726882853.42375: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882853.42387: done queuing things up, now waiting for results queue to drain 30564 1726882853.42388: waiting for pending results... 
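Every record in this trace has the same shape: the ansible-playbook process ID, a Unix timestamp with fractional seconds, a colon, and a free-form message. A minimal sketch of a parser for that record format follows; the field names `pid`/`ts`/`msg` are my own labels for illustration, not anything Ansible defines:

```python
import re

# One -vvvv trace record: "<pid> <epoch.frac>: <message>"
RECORD = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_record(line):
    """Split one verbose-trace line into (pid, timestamp, message),
    or return None if the line does not match the record shape."""
    m = RECORD.match(line)
    if m is None:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

# A record taken verbatim from the trace above.
sample = ("30564 1726882853.42137: entering _queue_task() "
          "for managed_node2/fedora.linux_system_roles.network_state")
pid, ts, msg = parse_record(sample)
```

Sorting parsed records by `ts` is a quick way to interleave traces from multiple worker processes when debugging.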
30564 1726882853.42584: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state
30564 1726882853.42690: in run() - task 0e448fcc-3ce9-4216-acec-000000001106
30564 1726882853.42702: variable 'ansible_search_path' from source: unknown
30564 1726882853.42706: variable 'ansible_search_path' from source: unknown
30564 1726882853.42735: calling self._execute()
30564 1726882853.42812: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.42816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.42826: variable 'omit' from source: magic vars
30564 1726882853.43114: variable 'ansible_distribution_major_version' from source: facts
30564 1726882853.43124: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882853.43216: variable 'network_state' from source: role '' defaults
30564 1726882853.43223: Evaluated conditional (network_state != {}): False
30564 1726882853.43226: when evaluation is False, skipping this task
30564 1726882853.43229: _execute() done
30564 1726882853.43232: dumping result to json
30564 1726882853.43234: done dumping result, returning
30564 1726882853.43241: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000001106]
30564 1726882853.43246: sending task result for task 0e448fcc-3ce9-4216-acec-000000001106
30564 1726882853.43334: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001106
30564 1726882853.43337: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882853.43412: no more pending results, returning what we have
30564 1726882853.43415: results queue empty
30564 1726882853.43416: checking for any_errors_fatal
30564 1726882853.43423: done checking for any_errors_fatal
30564 1726882853.43424: checking for max_fail_percentage
30564 1726882853.43425: done checking for max_fail_percentage
30564 1726882853.43426: checking to see if all hosts have failed and the running result is not ok
30564 1726882853.43426: done checking to see if all hosts have failed
30564 1726882853.43427: getting the remaining hosts for this loop
30564 1726882853.43429: done getting the remaining hosts for this loop
30564 1726882853.43432: getting the next task for host managed_node2
30564 1726882853.43438: done getting next task for host managed_node2
30564 1726882853.43441: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30564 1726882853.43452: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882853.43472: getting variables
30564 1726882853.43474: in VariableManager get_vars()
30564 1726882853.43506: Calling all_inventory to load vars for managed_node2
30564 1726882853.43509: Calling groups_inventory to load vars for managed_node2
30564 1726882853.43510: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882853.43517: Calling all_plugins_play to load vars for managed_node2
30564 1726882853.43519: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882853.43521: Calling groups_plugins_play to load vars for managed_node2
30564 1726882853.44382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882853.45344: done with get_vars()
30564 1726882853.45359: done getting variables
30564 1726882853.45402: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:40:53 -0400 (0:00:00.032) 0:00:52.035 ******
30564 1726882853.45426: entering _queue_task() for managed_node2/debug
30564 1726882853.45626: worker is 1 (out of 1 available)
30564 1726882853.45638: exiting _queue_task() for managed_node2/debug
30564 1726882853.45650: done queuing things up, now waiting for results queue to drain
30564 1726882853.45652: waiting for pending results...
30564 1726882853.45838: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30564 1726882853.45938: in run() - task 0e448fcc-3ce9-4216-acec-000000001107
30564 1726882853.45954: variable 'ansible_search_path' from source: unknown
30564 1726882853.45958: variable 'ansible_search_path' from source: unknown
30564 1726882853.45990: calling self._execute()
30564 1726882853.46060: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.46069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.46080: variable 'omit' from source: magic vars
30564 1726882853.46353: variable 'ansible_distribution_major_version' from source: facts
30564 1726882853.46363: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882853.46369: variable 'omit' from source: magic vars
30564 1726882853.46418: variable 'omit' from source: magic vars
30564 1726882853.46440: variable 'omit' from source: magic vars
30564 1726882853.46475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882853.46503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882853.46519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882853.46534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882853.46543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882853.46567: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882853.46573: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.46576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.46643: Set connection var ansible_timeout to 10
30564 1726882853.46648: Set connection var ansible_pipelining to False
30564 1726882853.46651: Set connection var ansible_shell_type to sh
30564 1726882853.46656: Set connection var ansible_shell_executable to /bin/sh
30564 1726882853.46662: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882853.46667: Set connection var ansible_connection to ssh
30564 1726882853.46688: variable 'ansible_shell_executable' from source: unknown
30564 1726882853.46691: variable 'ansible_connection' from source: unknown
30564 1726882853.46693: variable 'ansible_module_compression' from source: unknown
30564 1726882853.46696: variable 'ansible_shell_type' from source: unknown
30564 1726882853.46698: variable 'ansible_shell_executable' from source: unknown
30564 1726882853.46701: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.46704: variable 'ansible_pipelining' from source: unknown
30564 1726882853.46706: variable 'ansible_timeout' from source: unknown
30564 1726882853.46710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.46808: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882853.46820: variable 'omit' from source: magic vars
30564 1726882853.46823: starting attempt loop
30564 1726882853.46827: running the handler
30564 1726882853.46917: variable '__network_connections_result' from source: set_fact
30564 1726882853.46959: handler run complete
30564 1726882853.46976: attempt loop complete, returning result
30564 1726882853.46979: _execute() done
30564 1726882853.46982: dumping result to json
30564 1726882853.46985: done dumping result, returning
30564 1726882853.46993: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001107]
30564 1726882853.46997: sending task result for task 0e448fcc-3ce9-4216-acec-000000001107
30564 1726882853.47092: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001107
30564 1726882853.47094: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196"
    ]
}
30564 1726882853.47162: no more pending results, returning what we have
30564 1726882853.47167: results queue empty
30564 1726882853.47168: checking for any_errors_fatal
30564 1726882853.47172: done checking for any_errors_fatal
30564 1726882853.47173: checking for max_fail_percentage
30564 1726882853.47174: done checking for max_fail_percentage
30564 1726882853.47175: checking to see if all hosts have failed and the running result is not ok
30564 1726882853.47176: done checking to see if all hosts have failed
30564 1726882853.47177: getting the remaining hosts for this loop
30564 1726882853.47178: done getting the remaining hosts for this loop
30564 1726882853.47181: getting the next task for host managed_node2
30564 1726882853.47187: done getting next task for host managed_node2
30564 1726882853.47191: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30564 1726882853.47196: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882853.47206: getting variables
30564 1726882853.47207: in VariableManager get_vars()
30564 1726882853.47237: Calling all_inventory to load vars for managed_node2
30564 1726882853.47239: Calling groups_inventory to load vars for managed_node2
30564 1726882853.47241: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882853.47248: Calling all_plugins_play to load vars for managed_node2
30564 1726882853.47252: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882853.47255: Calling groups_plugins_play to load vars for managed_node2
30564 1726882853.48062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882853.49023: done with get_vars()
30564 1726882853.49037: done getting variables
30564 1726882853.49078: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:40:53 -0400 (0:00:00.036) 0:00:52.072 ******
30564 1726882853.49107: entering _queue_task() for managed_node2/debug
30564 1726882853.49292: worker is 1 (out of 1 available)
30564 1726882853.49304: exiting _queue_task() for managed_node2/debug
30564 1726882853.49316: done queuing things up, now waiting for results queue to drain
30564 1726882853.49317: waiting for pending results...
30564 1726882853.49496: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30564 1726882853.49647: in run() - task 0e448fcc-3ce9-4216-acec-000000001108
30564 1726882853.49676: variable 'ansible_search_path' from source: unknown
30564 1726882853.49685: variable 'ansible_search_path' from source: unknown
30564 1726882853.49722: calling self._execute()
30564 1726882853.49827: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.49838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.49852: variable 'omit' from source: magic vars
30564 1726882853.50256: variable 'ansible_distribution_major_version' from source: facts
30564 1726882853.50275: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882853.50287: variable 'omit' from source: magic vars
30564 1726882853.50360: variable 'omit' from source: magic vars
30564 1726882853.50399: variable 'omit' from source: magic vars
30564 1726882853.50455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882853.50497: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882853.50526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882853.50554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882853.50574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882853.50608: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882853.50618: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.50631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.50744: Set connection var ansible_timeout to 10
30564 1726882853.50772: Set connection var ansible_pipelining to False
30564 1726882853.50784: Set connection var ansible_shell_type to sh
30564 1726882853.50799: Set connection var ansible_shell_executable to /bin/sh
30564 1726882853.50807: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882853.50810: Set connection var ansible_connection to ssh
30564 1726882853.50835: variable 'ansible_shell_executable' from source: unknown
30564 1726882853.50840: variable 'ansible_connection' from source: unknown
30564 1726882853.50843: variable 'ansible_module_compression' from source: unknown
30564 1726882853.50845: variable 'ansible_shell_type' from source: unknown
30564 1726882853.50847: variable 'ansible_shell_executable' from source: unknown
30564 1726882853.50849: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.50851: variable 'ansible_pipelining' from source: unknown
30564 1726882853.50853: variable 'ansible_timeout' from source: unknown
30564 1726882853.50856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.50983: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882853.50994: variable 'omit' from source: magic vars
30564 1726882853.50998: starting attempt loop
30564 1726882853.51002: running the handler
30564 1726882853.51040: variable '__network_connections_result' from source: set_fact
30564 1726882853.51097: variable '__network_connections_result' from source: set_fact
30564 1726882853.51178: handler run complete
30564 1726882853.51200: attempt loop complete, returning result
30564 1726882853.51203: _execute() done
30564 1726882853.51205: dumping result to json
30564 1726882853.51207: done dumping result, returning
30564 1726882853.51214: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001108]
30564 1726882853.51219: sending task result for task 0e448fcc-3ce9-4216-acec-000000001108
30564 1726882853.51312: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001108
30564 1726882853.51315: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196"
        ]
    }
}
30564 1726882853.51409: no more pending results, returning what we have
30564 1726882853.51411: results queue empty
30564 1726882853.51412: checking for any_errors_fatal
30564 1726882853.51418: done checking for any_errors_fatal
30564 1726882853.51418: checking for max_fail_percentage
30564 1726882853.51420: done checking for max_fail_percentage
30564 1726882853.51421: checking to see if all hosts have failed and the running result is not ok
30564 1726882853.51421: done checking to see if all hosts have failed
30564 1726882853.51422: getting the remaining hosts for this loop
30564 1726882853.51425: done getting the remaining hosts for this loop
30564 1726882853.51428: getting the next task for host managed_node2
30564 1726882853.51435: done getting next task for host managed_node2
30564 1726882853.51438: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30564 1726882853.51442: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882853.51452: getting variables
30564 1726882853.51453: in VariableManager get_vars()
30564 1726882853.51491: Calling all_inventory to load vars for managed_node2
30564 1726882853.51493: Calling groups_inventory to load vars for managed_node2
30564 1726882853.51495: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882853.51501: Calling all_plugins_play to load vars for managed_node2
30564 1726882853.51503: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882853.51504: Calling groups_plugins_play to load vars for managed_node2
30564 1726882853.52409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882853.53722: done with get_vars()
30564 1726882853.53742: done getting variables
30564 1726882853.53794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:40:53 -0400 (0:00:00.047) 0:00:52.119 ******
30564 1726882853.53821: entering _queue_task() for managed_node2/debug
30564 1726882853.54051: worker is 1 (out of 1 available)
30564 1726882853.54062: exiting _queue_task() for managed_node2/debug
30564 1726882853.54078: done queuing things up, now waiting for results queue to drain
30564 1726882853.54079: waiting for pending results...
30564 1726882853.54359: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30564 1726882853.54509: in run() - task 0e448fcc-3ce9-4216-acec-000000001109
30564 1726882853.54533: variable 'ansible_search_path' from source: unknown
30564 1726882853.54540: variable 'ansible_search_path' from source: unknown
30564 1726882853.54582: calling self._execute()
30564 1726882853.54680: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.54691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.54705: variable 'omit' from source: magic vars
30564 1726882853.55085: variable 'ansible_distribution_major_version' from source: facts
30564 1726882853.55104: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882853.55235: variable 'network_state' from source: role '' defaults
30564 1726882853.55249: Evaluated conditional (network_state != {}): False
30564 1726882853.55257: when evaluation is False, skipping this task
30564 1726882853.55266: _execute() done
30564 1726882853.55281: dumping result to json
30564 1726882853.55288: done dumping result, returning
30564 1726882853.55299: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000001109]
30564 1726882853.55308: sending task result for task 0e448fcc-3ce9-4216-acec-000000001109
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
30564 1726882853.55452: no more pending results, returning what we have
30564 1726882853.55456: results queue empty
30564 1726882853.55457: checking for any_errors_fatal
30564 1726882853.55471: done checking for any_errors_fatal
30564 1726882853.55472: checking for max_fail_percentage
30564 1726882853.55474: done checking for max_fail_percentage
30564 1726882853.55476: checking to see if all hosts have failed and the running result is not ok
30564 1726882853.55476: done checking to see if all hosts have failed
30564 1726882853.55478: getting the remaining hosts for this loop
30564 1726882853.55480: done getting the remaining hosts for this loop
30564 1726882853.55484: getting the next task for host managed_node2
30564 1726882853.55492: done getting next task for host managed_node2
30564 1726882853.55496: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30564 1726882853.55501: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882853.55523: getting variables
30564 1726882853.55525: in VariableManager get_vars()
30564 1726882853.55562: Calling all_inventory to load vars for managed_node2
30564 1726882853.55567: Calling groups_inventory to load vars for managed_node2
30564 1726882853.55571: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882853.55584: Calling all_plugins_play to load vars for managed_node2
30564 1726882853.55586: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882853.55589: Calling groups_plugins_play to load vars for managed_node2
30564 1726882853.56584: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001109
30564 1726882853.56587: WORKER PROCESS EXITING
30564 1726882853.57227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882853.59091: done with get_vars()
30564 1726882853.59115: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:40:53 -0400 (0:00:00.053) 0:00:52.173 ******
30564 1726882853.59215: entering _queue_task() for managed_node2/ping
30564 1726882853.59507: worker is 1 (out of 1 available)
30564 1726882853.59520: exiting _queue_task() for managed_node2/ping
30564 1726882853.59534: done queuing things up, now waiting for results queue to drain
30564 1726882853.59535: waiting for pending results...
30564 1726882853.59837: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
30564 1726882853.60008: in run() - task 0e448fcc-3ce9-4216-acec-00000000110a
30564 1726882853.60027: variable 'ansible_search_path' from source: unknown
30564 1726882853.60036: variable 'ansible_search_path' from source: unknown
30564 1726882853.60081: calling self._execute()
30564 1726882853.60189: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.60206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.60222: variable 'omit' from source: magic vars
30564 1726882853.60586: variable 'ansible_distribution_major_version' from source: facts
30564 1726882853.60602: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882853.60611: variable 'omit' from source: magic vars
30564 1726882853.60681: variable 'omit' from source: magic vars
30564 1726882853.60716: variable 'omit' from source: magic vars
30564 1726882853.60767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882853.60807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882853.60829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882853.60852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882853.60873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882853.60904: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882853.60911: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.60918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.61022: Set connection var ansible_timeout to 10
30564 1726882853.61031: Set connection var ansible_pipelining to False
30564 1726882853.61036: Set connection var ansible_shell_type to sh
30564 1726882853.61043: Set connection var ansible_shell_executable to /bin/sh
30564 1726882853.61051: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882853.61056: Set connection var ansible_connection to ssh
30564 1726882853.61090: variable 'ansible_shell_executable' from source: unknown
30564 1726882853.61096: variable 'ansible_connection' from source: unknown
30564 1726882853.61101: variable 'ansible_module_compression' from source: unknown
30564 1726882853.61106: variable 'ansible_shell_type' from source: unknown
30564 1726882853.61110: variable 'ansible_shell_executable' from source: unknown
30564 1726882853.61115: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882853.61123: variable 'ansible_pipelining' from source: unknown
30564 1726882853.61128: variable 'ansible_timeout' from source: unknown
30564 1726882853.61134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882853.61342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
30564 1726882853.61359: variable 'omit' from source: magic vars
30564 1726882853.61373: starting attempt loop
30564 1726882853.61381: running the handler
30564 1726882853.61403: _low_level_execute_command(): starting
30564 1726882853.61415: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30564 1726882853.62190: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30564 1726882853.62205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882853.62222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882853.62241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882853.62293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882853.62305: stderr chunk (state=3): >>>debug2: match not found <<<
30564 1726882853.62319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882853.62338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30564 1726882853.62350: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
30564 1726882853.62362: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30564 1726882853.62381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882853.62399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882853.62414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882853.62426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882853.62439: stderr chunk (state=3): >>>debug2: match found <<<
30564 1726882853.62453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882853.62538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882853.62562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882853.62585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882853.62734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882853.64394: stdout chunk (state=3): >>>/root <<<
30564 1726882853.64498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882853.64583: stderr chunk (state=3): >>><<<
30564 1726882853.64595: stdout chunk (state=3): >>><<<
30564 1726882853.64714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30564 1726882853.64718: _low_level_execute_command(): starting
30564 1726882853.64721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605 `" && echo ansible-tmp-1726882853.6462572-32819-33125559234605="` echo /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605 `" ) && sleep 0'
30564 1726882853.65314: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 30564 1726882853.65329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.65344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.65361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.65413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.65425: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882853.65440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.65458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882853.65476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882853.65489: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882853.65510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.65524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.65540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.65552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.65563: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882853.65588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.65675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.65698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.65721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30564 1726882853.65855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.67831: stdout chunk (state=3): >>>ansible-tmp-1726882853.6462572-32819-33125559234605=/root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605 <<< 30564 1726882853.68009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882853.68012: stdout chunk (state=3): >>><<< 30564 1726882853.68019: stderr chunk (state=3): >>><<< 30564 1726882853.68038: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882853.6462572-32819-33125559234605=/root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882853.68089: variable 'ansible_module_compression' from source: unknown 30564 1726882853.68130: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882853.68163: variable 'ansible_facts' from source: unknown 30564 1726882853.68243: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605/AnsiballZ_ping.py 30564 1726882853.68381: Sending initial data 30564 1726882853.68384: Sent initial data (152 bytes) 30564 1726882853.69322: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882853.69332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.69342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.69355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.69395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.69402: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882853.69412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.69425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882853.69432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882853.69439: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882853.69450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.69456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.69472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.69478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882853.69485: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882853.69495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.69565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.69584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.69596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.69761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.71482: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882853.71573: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882853.71682: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp5ainy0sz /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605/AnsiballZ_ping.py <<< 30564 1726882853.71775: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882853.73054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882853.73145: stderr chunk (state=3): >>><<< 30564 1726882853.73148: stdout chunk (state=3): >>><<< 30564 1726882853.73173: done transferring module 
to remote 30564 1726882853.73186: _low_level_execute_command(): starting 30564 1726882853.73191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605/ /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605/AnsiballZ_ping.py && sleep 0' 30564 1726882853.73947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882853.73955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.73967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.73984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.74022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.74036: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882853.74048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.74066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882853.74080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882853.74086: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882853.74095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.74104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.74116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.74124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.74131: stderr chunk (state=3): >>>debug2: match found <<< 30564 
1726882853.74147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.74223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.74241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.74260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.74398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.76251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882853.76360: stderr chunk (state=3): >>><<< 30564 1726882853.76369: stdout chunk (state=3): >>><<< 30564 1726882853.76393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882853.76396: 
_low_level_execute_command(): starting 30564 1726882853.76401: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605/AnsiballZ_ping.py && sleep 0' 30564 1726882853.77393: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.77400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.77441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.77444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882853.77457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.77462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882853.77481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.77555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.77577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.77710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.90757: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} 
<<< 30564 1726882853.91875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882853.91880: stdout chunk (state=3): >>><<< 30564 1726882853.91886: stderr chunk (state=3): >>><<< 30564 1726882853.91903: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
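At this point the log shows a complete module round trip: ansible-core created a remote tmp directory, transferred AnsiballZ_ping.py over sftp, marked it executable, ran it with the remote Python, and received a single JSON object back on stdout ({"ping": "pong", ...}) surrounded by SSH debug noise on stderr. A minimal, illustrative sketch of extracting that JSON payload from a captured stdout string — the helper name `parse_module_stdout` is hypothetical and not part of ansible-core's API:

```python
import json

def parse_module_stdout(stdout: str) -> dict:
    """Pull the JSON result object out of a module's captured stdout.

    Modules print exactly one JSON object; stray whitespace or banner
    text around it is trimmed by locating the outermost braces.
    (Illustrative only -- not how ansible-core itself parses results.)
    """
    start = stdout.find("{")
    end = stdout.rfind("}")
    if start == -1 or end == -1:
        raise ValueError("no JSON object found in module stdout")
    return json.loads(stdout[start : end + 1])

# The stdout captured in the log above:
result = parse_module_stdout(
    ' {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} '
)
```

Under this sketch, `result["ping"]` yields `"pong"`, which is what the task result below reports as `"ping": "pong"`.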
30564 1726882853.91927: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882853.91935: _low_level_execute_command(): starting 30564 1726882853.91940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882853.6462572-32819-33125559234605/ > /dev/null 2>&1 && sleep 0' 30564 1726882853.93148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882853.93156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.93171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.93182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.93219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.93226: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882853.93235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.93248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882853.93255: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 
1726882853.93262: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882853.93274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882853.93284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882853.93295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882853.93302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882853.93308: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882853.93317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882853.93390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882853.93408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882853.93419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882853.93543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882853.95423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882853.95427: stdout chunk (state=3): >>><<< 30564 1726882853.95434: stderr chunk (state=3): >>><<< 30564 1726882853.95787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882853.95792: handler run complete 30564 1726882853.95811: attempt loop complete, returning result 30564 1726882853.95814: _execute() done 30564 1726882853.95816: dumping result to json 30564 1726882853.95818: done dumping result, returning 30564 1726882853.95829: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-00000000110a] 30564 1726882853.95834: sending task result for task 0e448fcc-3ce9-4216-acec-00000000110a 30564 1726882853.95938: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000110a 30564 1726882853.95941: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "ping": "pong"
}
30564 1726882853.96131: no more pending results, returning what we have 30564 1726882853.96134: results queue empty 30564 1726882853.96135: checking for any_errors_fatal 30564 1726882853.96140: done checking for any_errors_fatal 30564 1726882853.96141: checking for max_fail_percentage 30564 1726882853.96143: done checking for max_fail_percentage 30564 1726882853.96144: checking to see if all hosts have failed and the running result is not ok 30564 1726882853.96144: done checking to see if all hosts have failed 30564 1726882853.96145: getting the remaining hosts for this loop 30564 1726882853.96147: done getting the remaining hosts for this loop 30564
1726882853.96151: getting the next task for host managed_node2 30564 1726882853.96162: done getting next task for host managed_node2 30564 1726882853.96166: ^ task is: TASK: meta (role_complete) 30564 1726882853.96171: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882853.96184: getting variables 30564 1726882853.96186: in VariableManager get_vars() 30564 1726882853.96228: Calling all_inventory to load vars for managed_node2 30564 1726882853.96231: Calling groups_inventory to load vars for managed_node2 30564 1726882853.96233: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882853.96245: Calling all_plugins_play to load vars for managed_node2 30564 1726882853.96248: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882853.96251: Calling groups_plugins_play to load vars for managed_node2 30564 1726882853.99608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.02781: done with get_vars() 30564 1726882854.02814: done getting variables 30564 1726882854.02899: done queuing things up, now waiting for results queue to drain 30564 1726882854.02901: results queue empty 30564 1726882854.02902: checking for any_errors_fatal 30564 1726882854.02905: done checking for any_errors_fatal 30564 1726882854.02906: checking for max_fail_percentage 30564 1726882854.02908: done checking for max_fail_percentage 30564 1726882854.02908: checking to see if all hosts have failed and the running result is not ok 30564 1726882854.02909: done checking to see if all hosts have failed 30564 1726882854.02910: getting the remaining hosts for this loop 30564 1726882854.02911: done getting the remaining hosts for this loop 30564 1726882854.02919: getting the next task for host managed_node2 30564 1726882854.02924: done getting next task for host managed_node2 30564 1726882854.02926: ^ task is: TASK: Show result 30564 1726882854.02929: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882854.02931: getting variables 30564 1726882854.02932: in VariableManager get_vars() 30564 1726882854.02943: Calling all_inventory to load vars for managed_node2 30564 1726882854.02945: Calling groups_inventory to load vars for managed_node2 30564 1726882854.02947: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.02953: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.02955: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.02957: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.05153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.06909: done with get_vars() 30564 1726882854.06940: done getting variables 30564 1726882854.06990: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:40:54 -0400 (0:00:00.478) 0:00:52.651 ****** 30564 1726882854.07030: entering _queue_task() for managed_node2/debug 30564 1726882854.07448: worker is 1 (out of 1 available) 30564 1726882854.07469: exiting _queue_task() for managed_node2/debug 30564 1726882854.07482: done queuing things up, now waiting for results queue to drain 30564 1726882854.07483: waiting for pending results... 30564 1726882854.07790: running TaskExecutor() for managed_node2/TASK: Show result 30564 1726882854.07912: in run() - task 0e448fcc-3ce9-4216-acec-000000001090 30564 1726882854.07930: variable 'ansible_search_path' from source: unknown 30564 1726882854.07934: variable 'ansible_search_path' from source: unknown 30564 1726882854.07984: calling self._execute() 30564 1726882854.08071: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.08075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.08086: variable 'omit' from source: magic vars 30564 1726882854.08488: variable 'ansible_distribution_major_version' from source: facts 30564 1726882854.08496: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882854.08502: variable 'omit' from source: magic vars 30564 1726882854.08552: variable 'omit' from source: magic vars 30564 1726882854.08590: variable 'omit' from source: magic vars 30564 1726882854.08634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882854.08675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882854.08701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882854.08718: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882854.08730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882854.08760: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882854.08765: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.08772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.08878: Set connection var ansible_timeout to 10 30564 1726882854.08881: Set connection var ansible_pipelining to False 30564 1726882854.08886: Set connection var ansible_shell_type to sh 30564 1726882854.08891: Set connection var ansible_shell_executable to /bin/sh 30564 1726882854.08904: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882854.08907: Set connection var ansible_connection to ssh 30564 1726882854.08932: variable 'ansible_shell_executable' from source: unknown 30564 1726882854.08935: variable 'ansible_connection' from source: unknown 30564 1726882854.08938: variable 'ansible_module_compression' from source: unknown 30564 1726882854.08940: variable 'ansible_shell_type' from source: unknown 30564 1726882854.08942: variable 'ansible_shell_executable' from source: unknown 30564 1726882854.08945: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.08947: variable 'ansible_pipelining' from source: unknown 30564 1726882854.08949: variable 'ansible_timeout' from source: unknown 30564 1726882854.08953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.09101: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882854.09111: variable 'omit' from source: magic vars 30564 1726882854.09123: starting attempt loop 30564 1726882854.09126: running the handler 30564 1726882854.09174: variable '__network_connections_result' from source: set_fact 30564 1726882854.09250: variable '__network_connections_result' from source: set_fact 30564 1726882854.09373: handler run complete 30564 1726882854.09398: attempt loop complete, returning result 30564 1726882854.09402: _execute() done 30564 1726882854.09405: dumping result to json 30564 1726882854.09407: done dumping result, returning 30564 1726882854.09410: done running TaskExecutor() for managed_node2/TASK: Show result [0e448fcc-3ce9-4216-acec-000000001090] 30564 1726882854.09418: sending task result for task 0e448fcc-3ce9-4216-acec-000000001090 30564 1726882854.09535: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001090 30564 1726882854.09538: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196" ] } } 30564 1726882854.09623: no more pending results, returning what we have 30564 1726882854.09627: results queue empty 30564 1726882854.09628: checking for any_errors_fatal 30564 
1726882854.09629: done checking for any_errors_fatal 30564 1726882854.09630: checking for max_fail_percentage 30564 1726882854.09634: done checking for max_fail_percentage 30564 1726882854.09635: checking to see if all hosts have failed and the running result is not ok 30564 1726882854.09636: done checking to see if all hosts have failed 30564 1726882854.09637: getting the remaining hosts for this loop 30564 1726882854.09640: done getting the remaining hosts for this loop 30564 1726882854.09643: getting the next task for host managed_node2 30564 1726882854.09656: done getting next task for host managed_node2 30564 1726882854.09660: ^ task is: TASK: Include network role 30564 1726882854.09666: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882854.09673: getting variables 30564 1726882854.09675: in VariableManager get_vars() 30564 1726882854.09711: Calling all_inventory to load vars for managed_node2 30564 1726882854.09714: Calling groups_inventory to load vars for managed_node2 30564 1726882854.09719: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.09730: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.09734: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.09737: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.11493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.14721: done with get_vars() 30564 1726882854.14746: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:40:54 -0400 (0:00:00.079) 0:00:52.730 ****** 30564 1726882854.14946: entering _queue_task() for managed_node2/include_role 30564 1726882854.15594: worker is 1 (out of 1 available) 30564 1726882854.15607: exiting _queue_task() for managed_node2/include_role 30564 1726882854.15619: done queuing things up, now waiting for results queue to drain 30564 1726882854.15620: waiting for pending results... 
30564 1726882854.15917: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882854.16038: in run() - task 0e448fcc-3ce9-4216-acec-000000001094 30564 1726882854.16051: variable 'ansible_search_path' from source: unknown 30564 1726882854.16054: variable 'ansible_search_path' from source: unknown 30564 1726882854.16093: calling self._execute() 30564 1726882854.16189: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.16195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.16206: variable 'omit' from source: magic vars 30564 1726882854.16578: variable 'ansible_distribution_major_version' from source: facts 30564 1726882854.16590: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882854.16596: _execute() done 30564 1726882854.16599: dumping result to json 30564 1726882854.16608: done dumping result, returning 30564 1726882854.16615: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-000000001094] 30564 1726882854.16621: sending task result for task 0e448fcc-3ce9-4216-acec-000000001094 30564 1726882854.16748: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001094 30564 1726882854.16751: WORKER PROCESS EXITING 30564 1726882854.16780: no more pending results, returning what we have 30564 1726882854.16786: in VariableManager get_vars() 30564 1726882854.16826: Calling all_inventory to load vars for managed_node2 30564 1726882854.16829: Calling groups_inventory to load vars for managed_node2 30564 1726882854.16833: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.16847: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.16851: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.16854: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.18485: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.20979: done with get_vars() 30564 1726882854.21008: variable 'ansible_search_path' from source: unknown 30564 1726882854.21010: variable 'ansible_search_path' from source: unknown 30564 1726882854.21174: variable 'omit' from source: magic vars 30564 1726882854.21215: variable 'omit' from source: magic vars 30564 1726882854.21231: variable 'omit' from source: magic vars 30564 1726882854.21235: we have included files to process 30564 1726882854.21236: generating all_blocks data 30564 1726882854.21239: done generating all_blocks data 30564 1726882854.21245: processing included file: fedora.linux_system_roles.network 30564 1726882854.21269: in VariableManager get_vars() 30564 1726882854.21285: done with get_vars() 30564 1726882854.21312: in VariableManager get_vars() 30564 1726882854.21328: done with get_vars() 30564 1726882854.21368: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882854.21492: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882854.21704: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882854.22201: in VariableManager get_vars() 30564 1726882854.22223: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882854.24378: iterating over new_blocks loaded from include file 30564 1726882854.24380: in VariableManager get_vars() 30564 1726882854.24399: done with get_vars() 30564 1726882854.24401: filtering new block on tags 30564 1726882854.24689: done filtering new block on tags 30564 1726882854.24693: in VariableManager get_vars() 30564 1726882854.24708: done with get_vars() 30564 1726882854.24710: filtering new block on tags 30564 1726882854.24726: done 
filtering new block on tags 30564 1726882854.24728: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882854.24734: extending task lists for all hosts with included blocks 30564 1726882854.24851: done extending task lists 30564 1726882854.24852: done processing included files 30564 1726882854.24853: results queue empty 30564 1726882854.24854: checking for any_errors_fatal 30564 1726882854.24859: done checking for any_errors_fatal 30564 1726882854.24860: checking for max_fail_percentage 30564 1726882854.24862: done checking for max_fail_percentage 30564 1726882854.24862: checking to see if all hosts have failed and the running result is not ok 30564 1726882854.24865: done checking to see if all hosts have failed 30564 1726882854.24865: getting the remaining hosts for this loop 30564 1726882854.24867: done getting the remaining hosts for this loop 30564 1726882854.24870: getting the next task for host managed_node2 30564 1726882854.24874: done getting next task for host managed_node2 30564 1726882854.24877: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882854.24880: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882854.24890: getting variables 30564 1726882854.24891: in VariableManager get_vars() 30564 1726882854.24904: Calling all_inventory to load vars for managed_node2 30564 1726882854.24906: Calling groups_inventory to load vars for managed_node2 30564 1726882854.24909: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.24914: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.24916: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.24919: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.27110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.28798: done with get_vars() 30564 1726882854.28829: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:40:54 -0400 (0:00:00.139) 0:00:52.870 ****** 30564 1726882854.28915: entering _queue_task() for managed_node2/include_tasks 30564 1726882854.29266: worker is 1 (out of 1 available) 30564 1726882854.29278: exiting _queue_task() for managed_node2/include_tasks 30564 1726882854.29291: done queuing things up, now waiting for results queue to drain 30564 1726882854.29293: waiting for pending results... 
30564 1726882854.30025: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882854.30253: in run() - task 0e448fcc-3ce9-4216-acec-00000000127a 30564 1726882854.30273: variable 'ansible_search_path' from source: unknown 30564 1726882854.30399: variable 'ansible_search_path' from source: unknown 30564 1726882854.30435: calling self._execute() 30564 1726882854.30653: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.30657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.30660: variable 'omit' from source: magic vars 30564 1726882854.31135: variable 'ansible_distribution_major_version' from source: facts 30564 1726882854.31152: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882854.31273: _execute() done 30564 1726882854.31277: dumping result to json 30564 1726882854.31280: done dumping result, returning 30564 1726882854.31285: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-00000000127a] 30564 1726882854.31291: sending task result for task 0e448fcc-3ce9-4216-acec-00000000127a 30564 1726882854.31400: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000127a 30564 1726882854.31404: WORKER PROCESS EXITING 30564 1726882854.31456: no more pending results, returning what we have 30564 1726882854.31462: in VariableManager get_vars() 30564 1726882854.31515: Calling all_inventory to load vars for managed_node2 30564 1726882854.31518: Calling groups_inventory to load vars for managed_node2 30564 1726882854.31521: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.31536: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.31540: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.31543: Calling 
groups_plugins_play to load vars for managed_node2 30564 1726882854.34021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.35828: done with get_vars() 30564 1726882854.35848: variable 'ansible_search_path' from source: unknown 30564 1726882854.35850: variable 'ansible_search_path' from source: unknown 30564 1726882854.35891: we have included files to process 30564 1726882854.35893: generating all_blocks data 30564 1726882854.35895: done generating all_blocks data 30564 1726882854.35898: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882854.35899: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882854.35902: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882854.36477: done processing included file 30564 1726882854.36480: iterating over new_blocks loaded from include file 30564 1726882854.36481: in VariableManager get_vars() 30564 1726882854.36506: done with get_vars() 30564 1726882854.36508: filtering new block on tags 30564 1726882854.36540: done filtering new block on tags 30564 1726882854.36543: in VariableManager get_vars() 30564 1726882854.36567: done with get_vars() 30564 1726882854.36569: filtering new block on tags 30564 1726882854.36613: done filtering new block on tags 30564 1726882854.36616: in VariableManager get_vars() 30564 1726882854.36638: done with get_vars() 30564 1726882854.36641: filtering new block on tags 30564 1726882854.36687: done filtering new block on tags 30564 1726882854.36689: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882854.36695: extending task lists for 
all hosts with included blocks 30564 1726882854.38178: done extending task lists 30564 1726882854.38179: done processing included files 30564 1726882854.38179: results queue empty 30564 1726882854.38180: checking for any_errors_fatal 30564 1726882854.38182: done checking for any_errors_fatal 30564 1726882854.38182: checking for max_fail_percentage 30564 1726882854.38183: done checking for max_fail_percentage 30564 1726882854.38184: checking to see if all hosts have failed and the running result is not ok 30564 1726882854.38184: done checking to see if all hosts have failed 30564 1726882854.38185: getting the remaining hosts for this loop 30564 1726882854.38186: done getting the remaining hosts for this loop 30564 1726882854.38188: getting the next task for host managed_node2 30564 1726882854.38192: done getting next task for host managed_node2 30564 1726882854.38194: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882854.38197: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882854.38205: getting variables 30564 1726882854.38205: in VariableManager get_vars() 30564 1726882854.38214: Calling all_inventory to load vars for managed_node2 30564 1726882854.38216: Calling groups_inventory to load vars for managed_node2 30564 1726882854.38217: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.38220: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.38222: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.38223: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.38886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.40197: done with get_vars() 30564 1726882854.40217: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:40:54 -0400 (0:00:00.113) 0:00:52.984 ****** 30564 1726882854.40510: entering _queue_task() for managed_node2/setup 30564 1726882854.40862: worker is 1 (out of 1 available) 30564 1726882854.40878: exiting _queue_task() for managed_node2/setup 30564 1726882854.40891: done queuing things up, now waiting for results queue to drain 30564 1726882854.40893: waiting for pending results... 
30564 1726882854.41191: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882854.41326: in run() - task 0e448fcc-3ce9-4216-acec-0000000012d1 30564 1726882854.41344: variable 'ansible_search_path' from source: unknown 30564 1726882854.41348: variable 'ansible_search_path' from source: unknown 30564 1726882854.41382: calling self._execute() 30564 1726882854.41503: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.41506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.41517: variable 'omit' from source: magic vars 30564 1726882854.41905: variable 'ansible_distribution_major_version' from source: facts 30564 1726882854.41915: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882854.42077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882854.44534: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882854.44598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882854.44631: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882854.44666: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882854.44691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882854.44767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882854.44795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882854.44820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882854.44860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882854.44875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882854.44922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882854.44944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882854.44970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882854.45009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882854.45022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882854.45178: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882854.45185: variable 'ansible_facts' from source: unknown 30564 1726882854.46591: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882854.46595: when evaluation is False, skipping this task 30564 1726882854.46597: _execute() done 30564 1726882854.46600: dumping result to json 30564 1726882854.46603: done dumping result, returning 30564 1726882854.46605: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-0000000012d1] 30564 1726882854.46610: sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d1 30564 1726882854.46714: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d1 30564 1726882854.46718: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882854.46759: no more pending results, returning what we have 30564 1726882854.46762: results queue empty 30564 1726882854.46766: checking for any_errors_fatal 30564 1726882854.46788: done checking for any_errors_fatal 30564 1726882854.46790: checking for max_fail_percentage 30564 1726882854.46792: done checking for max_fail_percentage 30564 1726882854.46793: checking to see if all hosts have failed and the running result is not ok 30564 1726882854.46794: done checking to see if all hosts have failed 30564 1726882854.46794: getting the remaining hosts for this loop 30564 1726882854.46796: done getting the remaining hosts for this loop 30564 1726882854.46801: getting the next task for host managed_node2 30564 1726882854.46812: done getting next task for host managed_node2 30564 1726882854.46815: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882854.46821: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882854.46843: getting variables 30564 1726882854.46844: in VariableManager get_vars() 30564 1726882854.46920: Calling all_inventory to load vars for managed_node2 30564 1726882854.46923: Calling groups_inventory to load vars for managed_node2 30564 1726882854.46925: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.46933: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.46935: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.46942: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.48421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.50445: done with get_vars() 30564 1726882854.50477: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:40:54 -0400 (0:00:00.102) 0:00:53.087 ****** 30564 1726882854.50586: entering _queue_task() for managed_node2/stat 30564 1726882854.50939: worker is 1 (out of 1 available) 30564 1726882854.50954: exiting _queue_task() for managed_node2/stat 30564 1726882854.51178: done queuing things up, now waiting for results queue to drain 30564 1726882854.51180: waiting for pending results... 
30564 1726882854.51466: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882854.51649: in run() - task 0e448fcc-3ce9-4216-acec-0000000012d3 30564 1726882854.51675: variable 'ansible_search_path' from source: unknown 30564 1726882854.51684: variable 'ansible_search_path' from source: unknown 30564 1726882854.51739: calling self._execute() 30564 1726882854.51849: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.51861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.51880: variable 'omit' from source: magic vars 30564 1726882854.52362: variable 'ansible_distribution_major_version' from source: facts 30564 1726882854.52387: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882854.52559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882854.52858: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882854.52916: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882854.52959: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882854.53003: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882854.53100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882854.53136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882854.53196: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882854.53234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882854.53353: variable '__network_is_ostree' from source: set_fact 30564 1726882854.53367: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882854.53380: when evaluation is False, skipping this task 30564 1726882854.53388: _execute() done 30564 1726882854.53395: dumping result to json 30564 1726882854.53402: done dumping result, returning 30564 1726882854.53413: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-0000000012d3] 30564 1726882854.53424: sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d3 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882854.53566: no more pending results, returning what we have 30564 1726882854.53573: results queue empty 30564 1726882854.53574: checking for any_errors_fatal 30564 1726882854.53584: done checking for any_errors_fatal 30564 1726882854.53585: checking for max_fail_percentage 30564 1726882854.53587: done checking for max_fail_percentage 30564 1726882854.53588: checking to see if all hosts have failed and the running result is not ok 30564 1726882854.53588: done checking to see if all hosts have failed 30564 1726882854.53589: getting the remaining hosts for this loop 30564 1726882854.53591: done getting the remaining hosts for this loop 30564 1726882854.53594: getting the next task for host managed_node2 30564 1726882854.53603: done getting next task for host managed_node2 30564 
1726882854.53606: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882854.53611: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882854.53653: getting variables 30564 1726882854.53656: in VariableManager get_vars() 30564 1726882854.53704: Calling all_inventory to load vars for managed_node2 30564 1726882854.53707: Calling groups_inventory to load vars for managed_node2 30564 1726882854.53709: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.53721: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.53724: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.53727: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.54736: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d3 30564 1726882854.54740: WORKER PROCESS EXITING 30564 1726882854.56361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.58704: done with get_vars() 30564 1726882854.58732: done getting variables 30564 1726882854.58796: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:40:54 -0400 (0:00:00.082) 0:00:53.169 ****** 30564 1726882854.58841: entering _queue_task() for managed_node2/set_fact 30564 1726882854.59177: worker is 1 (out of 1 available) 30564 1726882854.59191: exiting _queue_task() for managed_node2/set_fact 30564 1726882854.59204: done queuing things up, now waiting for results queue to drain 30564 1726882854.59205: waiting for pending results... 
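[Editor's note] The `skipping: [managed_node2]` payload above has a fixed shape: when a `when` expression evaluates False, no module runs and the executor emits a skip result carrying the failed condition. A simplified sketch of that short-circuit — not Ansible's actual `TaskExecutor` code:

```python
def run_when(conditional_result: bool, false_condition: str) -> dict:
    """Sketch: short-circuit to a skip result when the conditional is False."""
    if not conditional_result:
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }
    # Placeholder for actually dispatching the module to the worker.
    return {"changed": True}

result = run_when(False, "not __network_is_ostree is defined")
```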
30564 1726882854.59525: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882854.59710: in run() - task 0e448fcc-3ce9-4216-acec-0000000012d4 30564 1726882854.59728: variable 'ansible_search_path' from source: unknown 30564 1726882854.59734: variable 'ansible_search_path' from source: unknown 30564 1726882854.59782: calling self._execute() 30564 1726882854.59901: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.59921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.59938: variable 'omit' from source: magic vars 30564 1726882854.60317: variable 'ansible_distribution_major_version' from source: facts 30564 1726882854.60333: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882854.60502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882854.60797: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882854.60849: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882854.60890: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882854.60930: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882854.61028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882854.61061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882854.61097: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882854.61134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882854.61231: variable '__network_is_ostree' from source: set_fact 30564 1726882854.61242: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882854.61248: when evaluation is False, skipping this task 30564 1726882854.61256: _execute() done 30564 1726882854.61266: dumping result to json 30564 1726882854.61279: done dumping result, returning 30564 1726882854.61292: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-0000000012d4] 30564 1726882854.61301: sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d4 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882854.61444: no more pending results, returning what we have 30564 1726882854.61449: results queue empty 30564 1726882854.61450: checking for any_errors_fatal 30564 1726882854.61458: done checking for any_errors_fatal 30564 1726882854.61459: checking for max_fail_percentage 30564 1726882854.61461: done checking for max_fail_percentage 30564 1726882854.61462: checking to see if all hosts have failed and the running result is not ok 30564 1726882854.61462: done checking to see if all hosts have failed 30564 1726882854.61465: getting the remaining hosts for this loop 30564 1726882854.61470: done getting the remaining hosts for this loop 30564 1726882854.61474: getting the next task for host managed_node2 30564 1726882854.61488: done getting next task for host managed_node2 30564 
1726882854.61491: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882854.61497: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882854.61523: getting variables 30564 1726882854.61525: in VariableManager get_vars() 30564 1726882854.61562: Calling all_inventory to load vars for managed_node2 30564 1726882854.61566: Calling groups_inventory to load vars for managed_node2 30564 1726882854.61571: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882854.61582: Calling all_plugins_play to load vars for managed_node2 30564 1726882854.61585: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882854.61588: Calling groups_plugins_play to load vars for managed_node2 30564 1726882854.62585: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d4 30564 1726882854.62589: WORKER PROCESS EXITING 30564 1726882854.63371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882854.65198: done with get_vars() 30564 1726882854.65219: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:40:54 -0400 (0:00:00.064) 0:00:53.234 ****** 30564 1726882854.65325: entering _queue_task() for managed_node2/service_facts 30564 1726882854.65626: worker is 1 (out of 1 available) 30564 1726882854.65639: exiting _queue_task() for managed_node2/service_facts 30564 1726882854.65653: done queuing things up, now waiting for results queue to drain 30564 1726882854.65654: waiting for pending results... 
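[Editor's note] Each `TASK [...]` header above ends with two bracketed durations, e.g. `(0:00:00.064) 0:00:53.234` — the previous task's wall time and the cumulative run time. A sketch of that `H:MM:SS.mmm` formatting, inferred from the printed layout rather than from the callback plugin's source:

```python
from datetime import timedelta

def fmt_elapsed(td: timedelta) -> str:
    """Render a duration like the task-header timing fields: H:MM:SS.mmm."""
    # Integer millisecond arithmetic avoids float rounding surprises.
    total_ms = (td.days * 86_400 + td.seconds) * 1000 + td.microseconds // 1000
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h}:{m:02d}:{s:02d}.{ms:03d}"
```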
30564 1726882854.65943: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882854.66104: in run() - task 0e448fcc-3ce9-4216-acec-0000000012d6 30564 1726882854.66125: variable 'ansible_search_path' from source: unknown 30564 1726882854.66133: variable 'ansible_search_path' from source: unknown 30564 1726882854.66173: calling self._execute() 30564 1726882854.66276: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.66287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.66299: variable 'omit' from source: magic vars 30564 1726882854.66676: variable 'ansible_distribution_major_version' from source: facts 30564 1726882854.66696: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882854.66708: variable 'omit' from source: magic vars 30564 1726882854.66813: variable 'omit' from source: magic vars 30564 1726882854.66851: variable 'omit' from source: magic vars 30564 1726882854.66907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882854.66945: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882854.66981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882854.67005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882854.67022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882854.67056: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882854.67070: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.67087: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882854.67204: Set connection var ansible_timeout to 10 30564 1726882854.67216: Set connection var ansible_pipelining to False 30564 1726882854.67223: Set connection var ansible_shell_type to sh 30564 1726882854.67233: Set connection var ansible_shell_executable to /bin/sh 30564 1726882854.67245: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882854.67252: Set connection var ansible_connection to ssh 30564 1726882854.67285: variable 'ansible_shell_executable' from source: unknown 30564 1726882854.67299: variable 'ansible_connection' from source: unknown 30564 1726882854.67311: variable 'ansible_module_compression' from source: unknown 30564 1726882854.67319: variable 'ansible_shell_type' from source: unknown 30564 1726882854.67326: variable 'ansible_shell_executable' from source: unknown 30564 1726882854.67333: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882854.67341: variable 'ansible_pipelining' from source: unknown 30564 1726882854.67348: variable 'ansible_timeout' from source: unknown 30564 1726882854.67356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882854.67576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882854.67591: variable 'omit' from source: magic vars 30564 1726882854.67601: starting attempt loop 30564 1726882854.67607: running the handler 30564 1726882854.67633: _low_level_execute_command(): starting 30564 1726882854.67646: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882854.68445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882854.68460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882854.68481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.68500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.68549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.68565: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882854.68583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.68603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882854.68614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882854.68632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882854.68645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.68660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.68684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.68698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.68711: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882854.68728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.68810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882854.68833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882854.68857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882854.69004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
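[Editor's note] After discovering the remote home directory (`echo ~ && sleep 0` above), Ansible stages a per-task temp directory under a `umask 77` so the directory and the uploaded module are mode 0700, readable only by the login user. A local, simplified sketch of that idiom — the real command uses backtick expansion and a timestamped `ansible-tmp-*` name:

```shell
#!/bin/sh
# Sketch of the staging-dir idiom from _low_level_execute_command().
# A local path and a fixed directory name are used here for illustration.
base="${TMPDIR:-/tmp}/ansible-sketch-$$"
( umask 77 && mkdir -p "$base" && \
  mkdir "$base/ansible-tmp-example" && \
  echo "tmpdir=$base/ansible-tmp-example" )
```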
30564 1726882854.70657: stdout chunk (state=3): >>>/root <<< 30564 1726882854.70760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882854.70833: stderr chunk (state=3): >>><<< 30564 1726882854.70836: stdout chunk (state=3): >>><<< 30564 1726882854.70939: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882854.70943: _low_level_execute_command(): starting 30564 1726882854.70946: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833 `" && echo ansible-tmp-1726882854.708528-32863-277382339399833="` echo /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833 `" ) && sleep 0' 30564 1726882854.71546: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882854.71559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.71579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.71601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.71649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.71662: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882854.71681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.71699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882854.71712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882854.71730: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882854.71743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.71756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.71776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.71789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.71800: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882854.71813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.71899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882854.71920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882854.71941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882854.72082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882854.73952: stdout chunk (state=3): >>>ansible-tmp-1726882854.708528-32863-277382339399833=/root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833 <<< 30564 1726882854.74066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882854.74142: stderr chunk (state=3): >>><<< 30564 1726882854.74152: stdout chunk (state=3): >>><<< 30564 1726882854.74458: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882854.708528-32863-277382339399833=/root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882854.74461: variable 'ansible_module_compression' from source: unknown 30564 1726882854.74466: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882854.74471: variable 'ansible_facts' from source: unknown 30564 1726882854.74473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833/AnsiballZ_service_facts.py 30564 1726882854.74530: Sending initial data 30564 1726882854.74533: Sent initial data (161 bytes) 30564 1726882854.75514: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882854.75529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.75551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.75576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.75619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.75631: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882854.75645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.75674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882854.75688: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882854.75699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882854.75711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.75724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.75740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.75751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882854.75770: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882854.75787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.75861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882854.75895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882854.75910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882854.76039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882854.77806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882854.77905: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882854.78009: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpr1r6oqvp /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833/AnsiballZ_service_facts.py <<< 30564 1726882854.78100: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882854.79426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882854.79712: stderr chunk (state=3): >>><<< 30564 1726882854.79716: stdout chunk (state=3): >>><<< 30564 
1726882854.79718: done transferring module to remote 30564 1726882854.79724: _low_level_execute_command(): starting 30564 1726882854.79727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833/ /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833/AnsiballZ_service_facts.py && sleep 0' 30564 1726882854.80356: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882854.80378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.80403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.80424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.80475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.80490: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882854.80516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.80536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882854.80550: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882854.80564: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882854.80583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.80598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.80625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.80640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.80653: stderr 
chunk (state=3): >>>debug2: match found <<< 30564 1726882854.80673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.80761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882854.80791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882854.80810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882854.80940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882854.82721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882854.82795: stderr chunk (state=3): >>><<< 30564 1726882854.82807: stdout chunk (state=3): >>><<< 30564 1726882854.82879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 30564 1726882854.82882: _low_level_execute_command(): starting 30564 1726882854.82885: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833/AnsiballZ_service_facts.py && sleep 0' 30564 1726882854.83766: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882854.83783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.83801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.83820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.83861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.83880: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882854.83895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.83915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882854.83927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882854.83938: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882854.83949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882854.83962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882854.83983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882854.83995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882854.84005: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882854.84020: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882854.84092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882854.84109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882854.84124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882854.84262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882856.18747: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": 
"nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 30564 1726882856.18804: stdout chunk (state=3): >>>e": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": 
{"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": 
"systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882856.19986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882856.20060: stderr chunk (state=3): >>><<< 30564 1726882856.20065: stdout chunk (state=3): >>><<< 30564 1726882856.20087: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882856.25919: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882856.25923: _low_level_execute_command(): starting 30564 1726882856.25928: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882854.708528-32863-277382339399833/ > /dev/null 2>&1 && sleep 0' 30564 1726882856.26573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882856.26577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.26591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.26601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.26640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.26647: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882856.26657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.26676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882856.26686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882856.26694: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882856.26701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.26709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.26721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.26728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.26735: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882856.26744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.26816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882856.26831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882856.26836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882856.26974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882856.28837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882856.28913: stderr chunk (state=3): >>><<< 30564 1726882856.28917: stdout chunk (state=3): >>><<< 30564 1726882856.28930: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882856.28936: handler run complete 30564 1726882856.29089: variable 'ansible_facts' from source: unknown 30564 1726882856.29227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882856.29647: variable 'ansible_facts' from source: unknown 30564 1726882856.29767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882856.29948: attempt loop complete, returning result 30564 1726882856.29954: _execute() done 30564 1726882856.29957: dumping result to json 30564 1726882856.30013: done dumping result, returning 30564 1726882856.30021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-0000000012d6] 30564 1726882856.30024: sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d6 30564 1726882856.37202: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d6 30564 1726882856.37206: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882856.37246: no more pending results, returning what we have 30564 1726882856.37248: results queue empty 30564 1726882856.37249: checking for 
any_errors_fatal 30564 1726882856.37252: done checking for any_errors_fatal 30564 1726882856.37253: checking for max_fail_percentage 30564 1726882856.37254: done checking for max_fail_percentage 30564 1726882856.37255: checking to see if all hosts have failed and the running result is not ok 30564 1726882856.37256: done checking to see if all hosts have failed 30564 1726882856.37256: getting the remaining hosts for this loop 30564 1726882856.37257: done getting the remaining hosts for this loop 30564 1726882856.37260: getting the next task for host managed_node2 30564 1726882856.37267: done getting next task for host managed_node2 30564 1726882856.37270: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882856.37276: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882856.37289: getting variables 30564 1726882856.37290: in VariableManager get_vars() 30564 1726882856.37308: Calling all_inventory to load vars for managed_node2 30564 1726882856.37311: Calling groups_inventory to load vars for managed_node2 30564 1726882856.37313: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882856.37318: Calling all_plugins_play to load vars for managed_node2 30564 1726882856.37320: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882856.37323: Calling groups_plugins_play to load vars for managed_node2 30564 1726882856.38933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882856.41911: done with get_vars() 30564 1726882856.41932: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:40:56 -0400 (0:00:01.767) 0:00:55.001 ****** 30564 1726882856.42040: entering _queue_task() for managed_node2/package_facts 30564 1726882856.42427: worker is 1 (out of 1 available) 30564 1726882856.42438: exiting _queue_task() for managed_node2/package_facts 30564 1726882856.42450: done queuing things up, now waiting for results queue to drain 30564 1726882856.42451: waiting for pending results... 
30564 1726882856.44046: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882856.44554: in run() - task 0e448fcc-3ce9-4216-acec-0000000012d7 30564 1726882856.44570: variable 'ansible_search_path' from source: unknown 30564 1726882856.44575: variable 'ansible_search_path' from source: unknown 30564 1726882856.44607: calling self._execute() 30564 1726882856.44719: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882856.44738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882856.44765: variable 'omit' from source: magic vars 30564 1726882856.45154: variable 'ansible_distribution_major_version' from source: facts 30564 1726882856.45176: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882856.45190: variable 'omit' from source: magic vars 30564 1726882856.45271: variable 'omit' from source: magic vars 30564 1726882856.45311: variable 'omit' from source: magic vars 30564 1726882856.45356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882856.45401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882856.45426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882856.45448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882856.45466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882856.45503: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882856.45515: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882856.45523: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882856.45634: Set connection var ansible_timeout to 10 30564 1726882856.45646: Set connection var ansible_pipelining to False 30564 1726882856.45653: Set connection var ansible_shell_type to sh 30564 1726882856.45665: Set connection var ansible_shell_executable to /bin/sh 30564 1726882856.45679: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882856.45686: Set connection var ansible_connection to ssh 30564 1726882856.45716: variable 'ansible_shell_executable' from source: unknown 30564 1726882856.45727: variable 'ansible_connection' from source: unknown 30564 1726882856.45735: variable 'ansible_module_compression' from source: unknown 30564 1726882856.45741: variable 'ansible_shell_type' from source: unknown 30564 1726882856.45747: variable 'ansible_shell_executable' from source: unknown 30564 1726882856.45753: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882856.45760: variable 'ansible_pipelining' from source: unknown 30564 1726882856.45769: variable 'ansible_timeout' from source: unknown 30564 1726882856.45778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882856.45979: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882856.45995: variable 'omit' from source: magic vars 30564 1726882856.46005: starting attempt loop 30564 1726882856.46011: running the handler 30564 1726882856.46028: _low_level_execute_command(): starting 30564 1726882856.46039: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882856.46760: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882856.46782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882856.46807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.46829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.46872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.46913: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882856.46931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.46949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882856.46959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882856.46971: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882856.46984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.46997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.47012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.47024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.47037: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882856.47055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.47135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882856.47154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882856.47169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882856.47462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882856.49055: stdout chunk (state=3): >>>/root <<< 30564 1726882856.49237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882856.49240: stdout chunk (state=3): >>><<< 30564 1726882856.49243: stderr chunk (state=3): >>><<< 30564 1726882856.49352: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882856.49355: _low_level_execute_command(): starting 30564 1726882856.49358: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794 `" && echo ansible-tmp-1726882856.4926112-32915-107933471309794="` echo /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794 `" ) && sleep 0' 30564 1726882856.49983: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.49986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.50027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.50031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.50034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.50100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882856.50114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882856.50243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882856.52186: stdout chunk (state=3): >>>ansible-tmp-1726882856.4926112-32915-107933471309794=/root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794 <<< 30564 1726882856.52367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882856.52370: stdout chunk (state=3): >>><<< 30564 1726882856.52373: stderr chunk (state=3): >>><<< 30564 1726882856.52473: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882856.4926112-32915-107933471309794=/root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882856.52477: variable 'ansible_module_compression' from source: unknown 30564 1726882856.52571: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882856.52575: variable 'ansible_facts' from source: unknown 30564 1726882856.52747: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794/AnsiballZ_package_facts.py 30564 1726882856.52903: Sending initial data 30564 1726882856.52906: Sent initial data (162 bytes) 30564 1726882856.53812: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882856.53826: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882856.53841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.53859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.53902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.53913: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882856.53925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.53942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882856.53952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882856.53962: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882856.53978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.53991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.54008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.54020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.54029: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882856.54041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.54117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882856.54133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882856.54146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882856.54287: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882856.56043: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882856.56144: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882856.56258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmptb_pbv6u /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794/AnsiballZ_package_facts.py <<< 30564 1726882856.56345: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882856.59361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882856.59566: stderr chunk (state=3): >>><<< 30564 1726882856.59571: stdout chunk (state=3): >>><<< 30564 1726882856.59574: done transferring module to remote 30564 1726882856.59576: _low_level_execute_command(): starting 30564 1726882856.59579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794/ /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794/AnsiballZ_package_facts.py && sleep 0' 30564 1726882856.60242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882856.60257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882856.60279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.60302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.60353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.60370: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882856.60385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.60400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882856.60411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882856.60423: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882856.60439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.60452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.60476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.60488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.60498: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882856.60511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.60601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882856.60639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882856.60681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882856.60843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882856.62659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882856.62766: stderr chunk (state=3): >>><<< 30564 1726882856.62783: stdout chunk (state=3): >>><<< 30564 1726882856.62870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882856.62874: _low_level_execute_command(): starting 30564 1726882856.62877: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794/AnsiballZ_package_facts.py && sleep 0' 30564 1726882856.63431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882856.63445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.63471: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882856.63496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.63543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.63572: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882856.63600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.63633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882856.63652: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882856.63664: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882856.63677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882856.63690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882856.63706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882856.63718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882856.63742: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882856.63768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882856.63832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882856.63835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882856.63947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882857.10446: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", 
"version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": 
"122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", 
"version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": 
"16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64",
"source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": 
"python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": 
"inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": 
[{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": 
"dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": 
[{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": 
[{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": 
"2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": 
"481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": 
"perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8<<< 30564 1726882857.10599: stdout chunk (state=3): >>>.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": 
"perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": 
[{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882857.12159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882857.12163: stdout chunk (state=3): >>><<< 30564 1726882857.12167: stderr chunk (state=3): >>><<< 30564 1726882857.12472: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882857.14863: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882857.14895: _low_level_execute_command(): starting 30564 1726882857.14904: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882856.4926112-32915-107933471309794/ > /dev/null 2>&1 && sleep 0' 30564 1726882857.15594: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882857.15609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882857.15624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882857.15647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882857.15696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882857.15709: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882857.15724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882857.15741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882857.15758: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882857.15777: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882857.15791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882857.15805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882857.15822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882857.15834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882857.15845: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882857.15860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882857.15943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882857.15968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882857.15994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882857.16128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882857.17984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882857.18054: stderr chunk (state=3): >>><<< 30564 1726882857.18066: stdout chunk (state=3): >>><<< 30564 1726882857.18374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882857.18377: handler run complete 30564 1726882857.19111: variable 'ansible_facts' from source: unknown 30564 1726882857.19651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.21116: variable 'ansible_facts' from source: unknown 30564 1726882857.21386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.21824: attempt loop complete, returning result 30564 1726882857.21834: _execute() done 30564 1726882857.21839: dumping result to json 30564 1726882857.21989: done dumping result, returning 30564 1726882857.21991: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-0000000012d7] 30564 1726882857.21994: sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d7 30564 1726882857.23960: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000012d7 30564 1726882857.23965: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882857.24050: no more pending results, returning what we have 30564 1726882857.24052: results queue empty 30564 1726882857.24053: checking for 
any_errors_fatal 30564 1726882857.24056: done checking for any_errors_fatal 30564 1726882857.24057: checking for max_fail_percentage 30564 1726882857.24058: done checking for max_fail_percentage 30564 1726882857.24058: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.24059: done checking to see if all hosts have failed 30564 1726882857.24059: getting the remaining hosts for this loop 30564 1726882857.24062: done getting the remaining hosts for this loop 30564 1726882857.24066: getting the next task for host managed_node2 30564 1726882857.24075: done getting next task for host managed_node2 30564 1726882857.24078: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882857.24081: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882857.24089: getting variables 30564 1726882857.24090: in VariableManager get_vars() 30564 1726882857.24113: Calling all_inventory to load vars for managed_node2 30564 1726882857.24115: Calling groups_inventory to load vars for managed_node2 30564 1726882857.24120: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.24127: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.24129: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.24131: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.24869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.26361: done with get_vars() 30564 1726882857.26386: done getting variables 30564 1726882857.26448: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:40:57 -0400 (0:00:00.844) 0:00:55.846 ****** 30564 1726882857.26490: entering _queue_task() for managed_node2/debug 30564 1726882857.26903: worker is 1 (out of 1 available) 30564 1726882857.26914: exiting _queue_task() for managed_node2/debug 30564 1726882857.26926: done queuing things up, now waiting for results queue to drain 30564 1726882857.26927: waiting for pending results... 
30564 1726882857.27223: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882857.27339: in run() - task 0e448fcc-3ce9-4216-acec-00000000127b 30564 1726882857.27350: variable 'ansible_search_path' from source: unknown 30564 1726882857.27353: variable 'ansible_search_path' from source: unknown 30564 1726882857.27387: calling self._execute() 30564 1726882857.27465: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.27470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.27482: variable 'omit' from source: magic vars 30564 1726882857.27763: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.27780: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.27784: variable 'omit' from source: magic vars 30564 1726882857.27831: variable 'omit' from source: magic vars 30564 1726882857.27899: variable 'network_provider' from source: set_fact 30564 1726882857.27912: variable 'omit' from source: magic vars 30564 1726882857.27948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882857.27979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882857.27997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882857.28009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882857.28018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882857.28044: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882857.28048: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882857.28050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.28121: Set connection var ansible_timeout to 10 30564 1726882857.28124: Set connection var ansible_pipelining to False 30564 1726882857.28127: Set connection var ansible_shell_type to sh 30564 1726882857.28134: Set connection var ansible_shell_executable to /bin/sh 30564 1726882857.28139: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882857.28144: Set connection var ansible_connection to ssh 30564 1726882857.28166: variable 'ansible_shell_executable' from source: unknown 30564 1726882857.28172: variable 'ansible_connection' from source: unknown 30564 1726882857.28175: variable 'ansible_module_compression' from source: unknown 30564 1726882857.28177: variable 'ansible_shell_type' from source: unknown 30564 1726882857.28179: variable 'ansible_shell_executable' from source: unknown 30564 1726882857.28181: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.28184: variable 'ansible_pipelining' from source: unknown 30564 1726882857.28186: variable 'ansible_timeout' from source: unknown 30564 1726882857.28188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.28294: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882857.28303: variable 'omit' from source: magic vars 30564 1726882857.28307: starting attempt loop 30564 1726882857.28310: running the handler 30564 1726882857.28347: handler run complete 30564 1726882857.28358: attempt loop complete, returning result 30564 1726882857.28360: _execute() done 30564 1726882857.28365: dumping result to json 30564 1726882857.28370: done dumping result, returning 
30564 1726882857.28374: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-00000000127b] 30564 1726882857.28383: sending task result for task 0e448fcc-3ce9-4216-acec-00000000127b ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882857.28532: no more pending results, returning what we have 30564 1726882857.28537: results queue empty 30564 1726882857.28538: checking for any_errors_fatal 30564 1726882857.28552: done checking for any_errors_fatal 30564 1726882857.28553: checking for max_fail_percentage 30564 1726882857.28554: done checking for max_fail_percentage 30564 1726882857.28555: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.28556: done checking to see if all hosts have failed 30564 1726882857.28557: getting the remaining hosts for this loop 30564 1726882857.28559: done getting the remaining hosts for this loop 30564 1726882857.28562: getting the next task for host managed_node2 30564 1726882857.28573: done getting next task for host managed_node2 30564 1726882857.28577: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882857.28583: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882857.28595: getting variables 30564 1726882857.28597: in VariableManager get_vars() 30564 1726882857.28629: Calling all_inventory to load vars for managed_node2 30564 1726882857.28632: Calling groups_inventory to load vars for managed_node2 30564 1726882857.28634: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.28643: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.28645: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.28648: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.29211: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000127b 30564 1726882857.29215: WORKER PROCESS EXITING 30564 1726882857.30002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.31550: done with get_vars() 30564 1726882857.31577: done getting variables 30564 1726882857.31640: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:40:57 -0400 (0:00:00.051) 0:00:55.898 ****** 30564 1726882857.31685: entering _queue_task() for managed_node2/fail 30564 1726882857.31957: worker is 1 (out of 1 available) 30564 1726882857.31976: exiting _queue_task() for managed_node2/fail 30564 1726882857.31988: done queuing things up, now waiting for results queue to drain 30564 1726882857.31989: waiting for pending results... 30564 1726882857.32160: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882857.32249: in run() - task 0e448fcc-3ce9-4216-acec-00000000127c 30564 1726882857.32267: variable 'ansible_search_path' from source: unknown 30564 1726882857.32271: variable 'ansible_search_path' from source: unknown 30564 1726882857.32303: calling self._execute() 30564 1726882857.32387: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.32391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.32400: variable 'omit' from source: magic vars 30564 1726882857.32693: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.32699: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.32787: variable 'network_state' from source: role '' defaults 30564 1726882857.32797: Evaluated conditional (network_state != {}): False 30564 1726882857.32805: when evaluation is False, skipping this task 30564 1726882857.32808: _execute() done 30564 1726882857.32810: dumping result to json 30564 1726882857.32813: done dumping result, returning 30564 1726882857.32820: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-00000000127c] 30564 1726882857.32825: sending task result for task 0e448fcc-3ce9-4216-acec-00000000127c 30564 1726882857.32913: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000127c 30564 1726882857.32916: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882857.32960: no more pending results, returning what we have 30564 1726882857.32967: results queue empty 30564 1726882857.32970: checking for any_errors_fatal 30564 1726882857.32976: done checking for any_errors_fatal 30564 1726882857.32977: checking for max_fail_percentage 30564 1726882857.32978: done checking for max_fail_percentage 30564 1726882857.32979: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.32980: done checking to see if all hosts have failed 30564 1726882857.32981: getting the remaining hosts for this loop 30564 1726882857.32983: done getting the remaining hosts for this loop 30564 1726882857.32986: getting the next task for host managed_node2 30564 1726882857.32993: done getting next task for host managed_node2 30564 1726882857.32997: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882857.33002: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882857.33021: getting variables 30564 1726882857.33023: in VariableManager get_vars() 30564 1726882857.33053: Calling all_inventory to load vars for managed_node2 30564 1726882857.33055: Calling groups_inventory to load vars for managed_node2 30564 1726882857.33057: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.33071: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.33074: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.33078: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.34189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.35424: done with get_vars() 30564 1726882857.35439: done getting variables 30564 1726882857.35482: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:40:57 -0400 (0:00:00.038) 0:00:55.936 ****** 30564 1726882857.35504: entering _queue_task() for managed_node2/fail 30564 1726882857.35690: worker is 1 (out of 1 available) 30564 1726882857.35702: exiting _queue_task() for managed_node2/fail 30564 1726882857.35712: done queuing things up, now waiting for results queue to drain 30564 1726882857.35714: waiting for pending results... 30564 1726882857.35887: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882857.35983: in run() - task 0e448fcc-3ce9-4216-acec-00000000127d 30564 1726882857.35993: variable 'ansible_search_path' from source: unknown 30564 1726882857.35996: variable 'ansible_search_path' from source: unknown 30564 1726882857.36024: calling self._execute() 30564 1726882857.36098: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.36102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.36111: variable 'omit' from source: magic vars 30564 1726882857.36372: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.36390: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.36471: variable 'network_state' from source: role '' defaults 30564 1726882857.36477: Evaluated conditional (network_state != {}): False 30564 1726882857.36481: when evaluation is False, skipping this task 30564 1726882857.36484: _execute() done 30564 1726882857.36488: dumping result to json 30564 1726882857.36491: done dumping result, returning 30564 1726882857.36494: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-00000000127d] 30564 1726882857.36501: sending task result for task 0e448fcc-3ce9-4216-acec-00000000127d 30564 1726882857.36597: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000127d 30564 1726882857.36600: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882857.36653: no more pending results, returning what we have 30564 1726882857.36656: results queue empty 30564 1726882857.36657: checking for any_errors_fatal 30564 1726882857.36662: done checking for any_errors_fatal 30564 1726882857.36665: checking for max_fail_percentage 30564 1726882857.36667: done checking for max_fail_percentage 30564 1726882857.36670: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.36670: done checking to see if all hosts have failed 30564 1726882857.36671: getting the remaining hosts for this loop 30564 1726882857.36672: done getting the remaining hosts for this loop 30564 1726882857.36676: getting the next task for host managed_node2 30564 1726882857.36683: done getting next task for host managed_node2 30564 1726882857.36686: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882857.36691: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882857.36710: getting variables 30564 1726882857.36716: in VariableManager get_vars() 30564 1726882857.36740: Calling all_inventory to load vars for managed_node2 30564 1726882857.36742: Calling groups_inventory to load vars for managed_node2 30564 1726882857.36743: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.36749: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.36751: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.36753: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.37521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.38492: done with get_vars() 30564 1726882857.38510: done getting variables 30564 1726882857.38551: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:40:57 -0400 (0:00:00.030) 0:00:55.967 ****** 30564 1726882857.38581: entering _queue_task() for managed_node2/fail 30564 1726882857.38799: worker is 1 (out of 1 available) 30564 1726882857.38813: exiting _queue_task() for managed_node2/fail 30564 1726882857.38826: done queuing things up, now waiting for results queue to drain 30564 1726882857.38827: waiting for pending results... 30564 1726882857.39006: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882857.39102: in run() - task 0e448fcc-3ce9-4216-acec-00000000127e 30564 1726882857.39112: variable 'ansible_search_path' from source: unknown 30564 1726882857.39116: variable 'ansible_search_path' from source: unknown 30564 1726882857.39146: calling self._execute() 30564 1726882857.39231: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.39235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.39243: variable 'omit' from source: magic vars 30564 1726882857.39526: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.39536: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.39656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882857.41460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882857.41506: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882857.41535: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882857.41560: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882857.41582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882857.41637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.41666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.41688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.41714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.41725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.41795: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.41806: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882857.41809: when evaluation is False, skipping this task 30564 1726882857.41812: _execute() done 30564 1726882857.41814: dumping result to json 30564 1726882857.41817: done dumping result, returning 30564 1726882857.41824: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming 
configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-00000000127e] 30564 1726882857.41829: sending task result for task 0e448fcc-3ce9-4216-acec-00000000127e 30564 1726882857.41914: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000127e 30564 1726882857.41917: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882857.41962: no more pending results, returning what we have 30564 1726882857.41968: results queue empty 30564 1726882857.41969: checking for any_errors_fatal 30564 1726882857.41976: done checking for any_errors_fatal 30564 1726882857.41977: checking for max_fail_percentage 30564 1726882857.41979: done checking for max_fail_percentage 30564 1726882857.41980: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.41980: done checking to see if all hosts have failed 30564 1726882857.41981: getting the remaining hosts for this loop 30564 1726882857.41983: done getting the remaining hosts for this loop 30564 1726882857.41986: getting the next task for host managed_node2 30564 1726882857.41994: done getting next task for host managed_node2 30564 1726882857.41998: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882857.42003: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882857.42024: getting variables 30564 1726882857.42032: in VariableManager get_vars() 30564 1726882857.42066: Calling all_inventory to load vars for managed_node2 30564 1726882857.42069: Calling groups_inventory to load vars for managed_node2 30564 1726882857.42071: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.42080: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.42083: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.42085: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.42976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.44223: done with get_vars() 30564 1726882857.44245: done getting variables 30564 1726882857.44300: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:40:57 -0400 (0:00:00.057) 0:00:56.024 ****** 30564 1726882857.44333: entering _queue_task() for managed_node2/dnf 30564 1726882857.44595: worker is 1 (out of 1 available) 30564 1726882857.44608: exiting _queue_task() for managed_node2/dnf 30564 1726882857.44620: done queuing things up, now waiting for results queue to drain 30564 1726882857.44622: waiting for pending results... 30564 1726882857.44921: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882857.45089: in run() - task 0e448fcc-3ce9-4216-acec-00000000127f 30564 1726882857.45108: variable 'ansible_search_path' from source: unknown 30564 1726882857.45117: variable 'ansible_search_path' from source: unknown 30564 1726882857.45168: calling self._execute() 30564 1726882857.45275: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.45291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.45307: variable 'omit' from source: magic vars 30564 1726882857.45698: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.45722: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.45923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882857.48121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882857.48195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882857.48236: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882857.48280: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882857.48321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882857.48490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.48561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.48623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.48662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.48677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.48779: variable 'ansible_distribution' from source: facts 30564 1726882857.48783: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.48795: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882857.48874: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882857.48957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.48975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.48993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.49018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.49030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.49058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.49079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.49098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.49122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.49132: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.49158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.49182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.49204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.49228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.49238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.49346: variable 'network_connections' from source: include params 30564 1726882857.49354: variable 'interface' from source: play vars 30564 1726882857.49403: variable 'interface' from source: play vars 30564 1726882857.49451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882857.49561: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882857.49591: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882857.49617: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882857.49637: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882857.49667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882857.49687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882857.49710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.49729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882857.49766: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882857.50044: variable 'network_connections' from source: include params 30564 1726882857.50050: variable 'interface' from source: play vars 30564 1726882857.50105: variable 'interface' from source: play vars 30564 1726882857.50123: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882857.50126: when evaluation is False, skipping this task 30564 1726882857.50128: _execute() done 30564 1726882857.50131: dumping result to json 30564 1726882857.50133: done dumping result, returning 30564 1726882857.50140: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000127f] 30564 
1726882857.50147: sending task result for task 0e448fcc-3ce9-4216-acec-00000000127f skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882857.50283: no more pending results, returning what we have 30564 1726882857.50286: results queue empty 30564 1726882857.50287: checking for any_errors_fatal 30564 1726882857.50292: done checking for any_errors_fatal 30564 1726882857.50293: checking for max_fail_percentage 30564 1726882857.50295: done checking for max_fail_percentage 30564 1726882857.50296: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.50297: done checking to see if all hosts have failed 30564 1726882857.50298: getting the remaining hosts for this loop 30564 1726882857.50299: done getting the remaining hosts for this loop 30564 1726882857.50303: getting the next task for host managed_node2 30564 1726882857.50311: done getting next task for host managed_node2 30564 1726882857.50315: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882857.50320: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882857.50342: getting variables 30564 1726882857.50344: in VariableManager get_vars() 30564 1726882857.50381: Calling all_inventory to load vars for managed_node2 30564 1726882857.50384: Calling groups_inventory to load vars for managed_node2 30564 1726882857.50386: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.50395: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.50398: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.50400: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.50965: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000127f 30564 1726882857.50972: WORKER PROCESS EXITING 30564 1726882857.51294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.53395: done with get_vars() 30564 1726882857.53416: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882857.53488: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:40:57 -0400 (0:00:00.091) 0:00:56.116 ****** 30564 1726882857.53519: entering _queue_task() for managed_node2/yum 30564 1726882857.53779: worker is 1 (out of 1 available) 30564 1726882857.53829: exiting _queue_task() for managed_node2/yum 30564 1726882857.53842: done queuing things up, now waiting for results queue to drain 30564 1726882857.53843: waiting for pending results... 30564 1726882857.54125: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882857.54273: in run() - task 0e448fcc-3ce9-4216-acec-000000001280 30564 1726882857.54300: variable 'ansible_search_path' from source: unknown 30564 1726882857.54309: variable 'ansible_search_path' from source: unknown 30564 1726882857.54350: calling self._execute() 30564 1726882857.54458: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.54471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.54485: variable 'omit' from source: magic vars 30564 1726882857.54850: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.54869: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.55348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882857.58727: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882857.58801: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882857.58844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882857.58886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882857.58922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882857.59003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.59052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.59087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.59792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.59813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.59914: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.59933: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882857.59940: when evaluation is False, skipping this task 30564 1726882857.59947: _execute() done 30564 1726882857.59952: dumping result to json 30564 1726882857.59959: done dumping result, 
returning 30564 1726882857.59973: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001280] 30564 1726882857.59983: sending task result for task 0e448fcc-3ce9-4216-acec-000000001280 30564 1726882857.60092: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001280 30564 1726882857.60099: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882857.60160: no more pending results, returning what we have 30564 1726882857.60166: results queue empty 30564 1726882857.60168: checking for any_errors_fatal 30564 1726882857.60174: done checking for any_errors_fatal 30564 1726882857.60175: checking for max_fail_percentage 30564 1726882857.60177: done checking for max_fail_percentage 30564 1726882857.60178: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.60179: done checking to see if all hosts have failed 30564 1726882857.60180: getting the remaining hosts for this loop 30564 1726882857.60181: done getting the remaining hosts for this loop 30564 1726882857.60186: getting the next task for host managed_node2 30564 1726882857.60194: done getting next task for host managed_node2 30564 1726882857.60199: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882857.60204: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882857.60231: getting variables 30564 1726882857.60233: in VariableManager get_vars() 30564 1726882857.60274: Calling all_inventory to load vars for managed_node2 30564 1726882857.60276: Calling groups_inventory to load vars for managed_node2 30564 1726882857.60279: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.60290: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.60293: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.60296: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.61980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.63703: done with get_vars() 30564 1726882857.63727: done getting variables 30564 1726882857.63786: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:40:57 -0400 (0:00:00.102) 0:00:56.219 ****** 30564 1726882857.63819: entering _queue_task() for managed_node2/fail 30564 1726882857.64109: worker is 1 (out of 1 available) 30564 1726882857.64122: exiting _queue_task() for managed_node2/fail 30564 1726882857.64134: done queuing things up, now waiting for results queue to drain 30564 1726882857.64135: waiting for pending results... 30564 1726882857.64424: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882857.64574: in run() - task 0e448fcc-3ce9-4216-acec-000000001281 30564 1726882857.64597: variable 'ansible_search_path' from source: unknown 30564 1726882857.64605: variable 'ansible_search_path' from source: unknown 30564 1726882857.64644: calling self._execute() 30564 1726882857.64751: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.64766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.64783: variable 'omit' from source: magic vars 30564 1726882857.65155: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.65176: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.65299: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882857.65507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882857.68296: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882857.68366: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882857.68472: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882857.68640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882857.68674: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882857.68783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.68878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.68960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.69060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.69100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.69198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 
1726882857.69286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.69396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.69441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.69461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.69521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.69614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.69724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.69770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.69790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30564 1726882857.70197: variable 'network_connections' from source: include params 30564 1726882857.70213: variable 'interface' from source: play vars 30564 1726882857.70287: variable 'interface' from source: play vars 30564 1726882857.70421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882857.70862: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882857.70934: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882857.71037: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882857.71072: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882857.71156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882857.71247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882857.71280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.71360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882857.71417: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882857.72008: variable 'network_connections' from source: include params 30564 1726882857.72019: variable 'interface' from source: play 
vars 30564 1726882857.72084: variable 'interface' from source: play vars 30564 1726882857.72194: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882857.72207: when evaluation is False, skipping this task 30564 1726882857.72215: _execute() done 30564 1726882857.72223: dumping result to json 30564 1726882857.72230: done dumping result, returning 30564 1726882857.72242: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001281] 30564 1726882857.72324: sending task result for task 0e448fcc-3ce9-4216-acec-000000001281 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882857.72500: no more pending results, returning what we have 30564 1726882857.72506: results queue empty 30564 1726882857.72507: checking for any_errors_fatal 30564 1726882857.72514: done checking for any_errors_fatal 30564 1726882857.72515: checking for max_fail_percentage 30564 1726882857.72517: done checking for max_fail_percentage 30564 1726882857.72518: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.72519: done checking to see if all hosts have failed 30564 1726882857.72520: getting the remaining hosts for this loop 30564 1726882857.72522: done getting the remaining hosts for this loop 30564 1726882857.72526: getting the next task for host managed_node2 30564 1726882857.72536: done getting next task for host managed_node2 30564 1726882857.72540: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882857.72546: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882857.72573: getting variables 30564 1726882857.72575: in VariableManager get_vars() 30564 1726882857.72617: Calling all_inventory to load vars for managed_node2 30564 1726882857.72620: Calling groups_inventory to load vars for managed_node2 30564 1726882857.72623: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.72635: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.72638: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.72641: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.73582: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001281 30564 1726882857.73586: WORKER PROCESS EXITING 30564 1726882857.74486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.76892: done with get_vars() 30564 1726882857.76920: done getting variables 30564 1726882857.76989: Loading ActionModule 
'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:40:57 -0400 (0:00:00.132) 0:00:56.351 ****** 30564 1726882857.77032: entering _queue_task() for managed_node2/package 30564 1726882857.77309: worker is 1 (out of 1 available) 30564 1726882857.77324: exiting _queue_task() for managed_node2/package 30564 1726882857.77336: done queuing things up, now waiting for results queue to drain 30564 1726882857.77337: waiting for pending results... 30564 1726882857.77529: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882857.77631: in run() - task 0e448fcc-3ce9-4216-acec-000000001282 30564 1726882857.77642: variable 'ansible_search_path' from source: unknown 30564 1726882857.77645: variable 'ansible_search_path' from source: unknown 30564 1726882857.77677: calling self._execute() 30564 1726882857.77755: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.77759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.77772: variable 'omit' from source: magic vars 30564 1726882857.78043: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.78050: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.78191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882857.78418: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 
1726882857.78467: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882857.78508: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882857.78562: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882857.78678: variable 'network_packages' from source: role '' defaults 30564 1726882857.78781: variable '__network_provider_setup' from source: role '' defaults 30564 1726882857.78796: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882857.78889: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882857.78908: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882857.79096: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882857.79353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882857.81598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882857.81643: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882857.81673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882857.81696: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882857.81715: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882857.81775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.81794: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.81811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.81837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.81851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.81883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.81901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.81917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.81941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.81951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882857.82098: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882857.82166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.82187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.82205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.82230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.82240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.82305: variable 'ansible_python' from source: facts 30564 1726882857.82319: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882857.82372: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882857.82433: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882857.82519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.82535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.82552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.82579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.82590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.82623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882857.82643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882857.82660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.82690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882857.82700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882857.82797: variable 'network_connections' from source: include params 
30564 1726882857.82800: variable 'interface' from source: play vars 30564 1726882857.82872: variable 'interface' from source: play vars 30564 1726882857.83185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882857.83225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882857.83273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882857.83317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882857.83372: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882857.84731: variable 'network_connections' from source: include params 30564 1726882857.84741: variable 'interface' from source: play vars 30564 1726882857.85175: variable 'interface' from source: play vars 30564 1726882857.85178: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882857.85181: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882857.85698: variable 'network_connections' from source: include params 30564 1726882857.85701: variable 'interface' from source: play vars 30564 1726882857.85704: variable 'interface' from source: play vars 30564 1726882857.85706: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882857.85708: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882857.85921: variable 'network_connections' 
from source: include params 30564 1726882857.85937: variable 'interface' from source: play vars 30564 1726882857.86013: variable 'interface' from source: play vars 30564 1726882857.86058: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882857.86115: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882857.86120: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882857.86168: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882857.86441: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882857.86937: variable 'network_connections' from source: include params 30564 1726882857.86946: variable 'interface' from source: play vars 30564 1726882857.87048: variable 'interface' from source: play vars 30564 1726882857.87059: variable 'ansible_distribution' from source: facts 30564 1726882857.87069: variable '__network_rh_distros' from source: role '' defaults 30564 1726882857.87104: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.87122: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882857.87327: variable 'ansible_distribution' from source: facts 30564 1726882857.87335: variable '__network_rh_distros' from source: role '' defaults 30564 1726882857.87344: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.87371: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882857.87546: variable 'ansible_distribution' from source: facts 30564 1726882857.87558: variable '__network_rh_distros' from source: role '' defaults 30564 1726882857.87576: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.87616: variable 'network_provider' from source: set_fact 30564 
1726882857.87641: variable 'ansible_facts' from source: unknown 30564 1726882857.88559: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882857.88562: when evaluation is False, skipping this task 30564 1726882857.88565: _execute() done 30564 1726882857.88567: dumping result to json 30564 1726882857.88570: done dumping result, returning 30564 1726882857.88573: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000001282] 30564 1726882857.88575: sending task result for task 0e448fcc-3ce9-4216-acec-000000001282 30564 1726882857.88763: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001282 30564 1726882857.88768: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882857.88816: no more pending results, returning what we have 30564 1726882857.88820: results queue empty 30564 1726882857.88821: checking for any_errors_fatal 30564 1726882857.88826: done checking for any_errors_fatal 30564 1726882857.88826: checking for max_fail_percentage 30564 1726882857.88828: done checking for max_fail_percentage 30564 1726882857.88829: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.88830: done checking to see if all hosts have failed 30564 1726882857.88831: getting the remaining hosts for this loop 30564 1726882857.88832: done getting the remaining hosts for this loop 30564 1726882857.88836: getting the next task for host managed_node2 30564 1726882857.88954: done getting next task for host managed_node2 30564 1726882857.88961: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882857.88977: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882857.88999: getting variables 30564 1726882857.89000: in VariableManager get_vars() 30564 1726882857.89044: Calling all_inventory to load vars for managed_node2 30564 1726882857.89048: Calling groups_inventory to load vars for managed_node2 30564 1726882857.89051: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.89060: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.89065: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.89087: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.90333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.92201: done with get_vars() 30564 1726882857.92225: done getting variables 30564 1726882857.92301: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:40:57 -0400 (0:00:00.153) 0:00:56.504 ****** 30564 1726882857.92335: entering _queue_task() for managed_node2/package 30564 1726882857.92679: worker is 1 (out of 1 available) 30564 1726882857.92693: exiting _queue_task() for managed_node2/package 30564 1726882857.92705: done queuing things up, now waiting for results queue to drain 30564 1726882857.92706: waiting for pending results... 
30564 1726882857.92932: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882857.93042: in run() - task 0e448fcc-3ce9-4216-acec-000000001283 30564 1726882857.93055: variable 'ansible_search_path' from source: unknown 30564 1726882857.93059: variable 'ansible_search_path' from source: unknown 30564 1726882857.93092: calling self._execute() 30564 1726882857.93173: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.93176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.93186: variable 'omit' from source: magic vars 30564 1726882857.93452: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.93463: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.93549: variable 'network_state' from source: role '' defaults 30564 1726882857.93558: Evaluated conditional (network_state != {}): False 30564 1726882857.93560: when evaluation is False, skipping this task 30564 1726882857.93565: _execute() done 30564 1726882857.93571: dumping result to json 30564 1726882857.93574: done dumping result, returning 30564 1726882857.93579: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001283] 30564 1726882857.93585: sending task result for task 0e448fcc-3ce9-4216-acec-000000001283 30564 1726882857.93682: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001283 30564 1726882857.93685: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882857.93735: no more pending results, returning what we have 30564 1726882857.93739: results queue empty 30564 1726882857.93740: checking 
for any_errors_fatal 30564 1726882857.93746: done checking for any_errors_fatal 30564 1726882857.93747: checking for max_fail_percentage 30564 1726882857.93748: done checking for max_fail_percentage 30564 1726882857.93749: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.93750: done checking to see if all hosts have failed 30564 1726882857.93751: getting the remaining hosts for this loop 30564 1726882857.93752: done getting the remaining hosts for this loop 30564 1726882857.93755: getting the next task for host managed_node2 30564 1726882857.93762: done getting next task for host managed_node2 30564 1726882857.93770: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882857.93775: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882857.93800: getting variables 30564 1726882857.93802: in VariableManager get_vars() 30564 1726882857.93833: Calling all_inventory to load vars for managed_node2 30564 1726882857.93836: Calling groups_inventory to load vars for managed_node2 30564 1726882857.93838: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.93845: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.93847: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.93849: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.94626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.95602: done with get_vars() 30564 1726882857.95618: done getting variables 30564 1726882857.95660: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:40:57 -0400 (0:00:00.033) 0:00:56.538 ****** 30564 1726882857.95688: entering _queue_task() for managed_node2/package 30564 1726882857.95903: worker is 1 (out of 1 available) 30564 1726882857.95917: exiting _queue_task() for managed_node2/package 30564 1726882857.95929: done queuing things up, now waiting for results queue to drain 30564 1726882857.95931: waiting for pending results... 
30564 1726882857.96112: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882857.96195: in run() - task 0e448fcc-3ce9-4216-acec-000000001284 30564 1726882857.96207: variable 'ansible_search_path' from source: unknown 30564 1726882857.96211: variable 'ansible_search_path' from source: unknown 30564 1726882857.96238: calling self._execute() 30564 1726882857.96319: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.96322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.96331: variable 'omit' from source: magic vars 30564 1726882857.96605: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.96617: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.96706: variable 'network_state' from source: role '' defaults 30564 1726882857.96718: Evaluated conditional (network_state != {}): False 30564 1726882857.96721: when evaluation is False, skipping this task 30564 1726882857.96723: _execute() done 30564 1726882857.96726: dumping result to json 30564 1726882857.96730: done dumping result, returning 30564 1726882857.96737: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001284] 30564 1726882857.96745: sending task result for task 0e448fcc-3ce9-4216-acec-000000001284 30564 1726882857.96835: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001284 30564 1726882857.96838: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882857.96883: no more pending results, returning what we have 30564 1726882857.96887: results queue empty 30564 1726882857.96888: checking for 
any_errors_fatal 30564 1726882857.96894: done checking for any_errors_fatal 30564 1726882857.96894: checking for max_fail_percentage 30564 1726882857.96896: done checking for max_fail_percentage 30564 1726882857.96897: checking to see if all hosts have failed and the running result is not ok 30564 1726882857.96898: done checking to see if all hosts have failed 30564 1726882857.96899: getting the remaining hosts for this loop 30564 1726882857.96900: done getting the remaining hosts for this loop 30564 1726882857.96904: getting the next task for host managed_node2 30564 1726882857.96912: done getting next task for host managed_node2 30564 1726882857.96916: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882857.96921: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882857.96942: getting variables 30564 1726882857.96944: in VariableManager get_vars() 30564 1726882857.96982: Calling all_inventory to load vars for managed_node2 30564 1726882857.96985: Calling groups_inventory to load vars for managed_node2 30564 1726882857.96987: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882857.96996: Calling all_plugins_play to load vars for managed_node2 30564 1726882857.96999: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882857.97001: Calling groups_plugins_play to load vars for managed_node2 30564 1726882857.97891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882857.98828: done with get_vars() 30564 1726882857.98843: done getting variables 30564 1726882857.98886: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:40:57 -0400 (0:00:00.032) 0:00:56.570 ****** 30564 1726882857.98911: entering _queue_task() for managed_node2/service 30564 1726882857.99115: worker is 1 (out of 1 available) 30564 1726882857.99128: exiting _queue_task() for managed_node2/service 30564 1726882857.99141: done queuing things up, now waiting for results queue to drain 30564 1726882857.99142: waiting for pending results... 
30564 1726882857.99331: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882857.99428: in run() - task 0e448fcc-3ce9-4216-acec-000000001285 30564 1726882857.99439: variable 'ansible_search_path' from source: unknown 30564 1726882857.99441: variable 'ansible_search_path' from source: unknown 30564 1726882857.99478: calling self._execute() 30564 1726882857.99552: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882857.99560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882857.99574: variable 'omit' from source: magic vars 30564 1726882857.99853: variable 'ansible_distribution_major_version' from source: facts 30564 1726882857.99864: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882857.99951: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882858.00088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882858.01685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882858.01735: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882858.01762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882858.01790: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882858.01809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882858.01871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882858.01903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.01920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.01949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.01962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.01999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.02016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.02033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.02061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.02075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.02103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.02119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.02135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.02161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.02173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.02288: variable 'network_connections' from source: include params 30564 1726882858.02297: variable 'interface' from source: play vars 30564 1726882858.02341: variable 'interface' from source: play vars 30564 1726882858.02397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882858.02508: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882858.02534: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882858.02556: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882858.02581: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882858.02613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882858.02628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882858.02646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.02665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882858.02704: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882858.02862: variable 'network_connections' from source: include params 30564 1726882858.02867: variable 'interface' from source: play vars 30564 1726882858.02910: variable 'interface' from source: play vars 30564 1726882858.02930: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882858.02934: when evaluation is False, skipping this task 30564 1726882858.02936: _execute() done 30564 1726882858.02939: dumping result to json 30564 1726882858.02941: done dumping result, returning 30564 1726882858.02946: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001285] 30564 1726882858.02955: sending task result for task 0e448fcc-3ce9-4216-acec-000000001285 30564 1726882858.03042: done sending task result for task 
0e448fcc-3ce9-4216-acec-000000001285 30564 1726882858.03051: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882858.03110: no more pending results, returning what we have 30564 1726882858.03114: results queue empty 30564 1726882858.03115: checking for any_errors_fatal 30564 1726882858.03122: done checking for any_errors_fatal 30564 1726882858.03123: checking for max_fail_percentage 30564 1726882858.03125: done checking for max_fail_percentage 30564 1726882858.03126: checking to see if all hosts have failed and the running result is not ok 30564 1726882858.03126: done checking to see if all hosts have failed 30564 1726882858.03127: getting the remaining hosts for this loop 30564 1726882858.03129: done getting the remaining hosts for this loop 30564 1726882858.03132: getting the next task for host managed_node2 30564 1726882858.03144: done getting next task for host managed_node2 30564 1726882858.03149: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882858.03153: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882858.03182: getting variables 30564 1726882858.03184: in VariableManager get_vars() 30564 1726882858.03216: Calling all_inventory to load vars for managed_node2 30564 1726882858.03218: Calling groups_inventory to load vars for managed_node2 30564 1726882858.03220: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882858.03229: Calling all_plugins_play to load vars for managed_node2 30564 1726882858.03231: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882858.03233: Calling groups_plugins_play to load vars for managed_node2 30564 1726882858.04151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882858.05107: done with get_vars() 30564 1726882858.05123: done getting variables 30564 1726882858.05162: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:40:58 -0400 (0:00:00.062) 0:00:56.633 ****** 30564 1726882858.05190: entering _queue_task() for managed_node2/service 30564 1726882858.05403: worker is 1 (out of 1 available) 30564 1726882858.05416: exiting _queue_task() for managed_node2/service 30564 1726882858.05429: done 
queuing things up, now waiting for results queue to drain 30564 1726882858.05431: waiting for pending results... 30564 1726882858.05612: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882858.05707: in run() - task 0e448fcc-3ce9-4216-acec-000000001286 30564 1726882858.05717: variable 'ansible_search_path' from source: unknown 30564 1726882858.05720: variable 'ansible_search_path' from source: unknown 30564 1726882858.05749: calling self._execute() 30564 1726882858.05822: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882858.05825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882858.05836: variable 'omit' from source: magic vars 30564 1726882858.06104: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.06114: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882858.06222: variable 'network_provider' from source: set_fact 30564 1726882858.06225: variable 'network_state' from source: role '' defaults 30564 1726882858.06236: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882858.06240: variable 'omit' from source: magic vars 30564 1726882858.06289: variable 'omit' from source: magic vars 30564 1726882858.06309: variable 'network_service_name' from source: role '' defaults 30564 1726882858.06354: variable 'network_service_name' from source: role '' defaults 30564 1726882858.06426: variable '__network_provider_setup' from source: role '' defaults 30564 1726882858.06431: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882858.06478: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882858.06485: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882858.06530: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882858.06678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882858.08398: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882858.08467: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882858.08508: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882858.08558: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882858.08594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882858.08679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.08714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.08745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.08796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.08816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.08867: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.08897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.08926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.08975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.08996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.09230: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882858.09328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.09345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.09362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.09395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.09405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.09476: variable 'ansible_python' from source: facts 30564 1726882858.09487: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882858.09548: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882858.09610: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882858.09694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.09715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.09731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.09757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.09773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.09803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.09824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.09840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.09867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.09880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.09971: variable 'network_connections' from source: include params 30564 1726882858.09977: variable 'interface' from source: play vars 30564 1726882858.10026: variable 'interface' from source: play vars 30564 1726882858.10103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882858.10221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882858.10255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882858.10293: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882858.10326: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882858.10387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882858.10410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882858.10433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.10455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882858.10500: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882858.10684: variable 'network_connections' from source: include params 30564 1726882858.10689: variable 'interface' from source: play vars 30564 1726882858.10744: variable 'interface' from source: play vars 30564 1726882858.10768: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882858.10825: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882858.11012: variable 'network_connections' from source: include params 30564 1726882858.11015: variable 'interface' from source: play vars 30564 1726882858.11082: variable 'interface' from source: play vars 30564 1726882858.11108: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882858.11187: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882858.11450: variable 'network_connections' from source: include params 30564 1726882858.11459: variable 'interface' from source: play vars 30564 1726882858.11533: variable 'interface' from source: play vars 30564 1726882858.11592: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882858.11653: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882858.11671: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882858.11733: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882858.11952: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882858.12412: variable 'network_connections' from source: include params 30564 1726882858.12421: variable 'interface' from source: play vars 30564 1726882858.12486: variable 'interface' from source: play vars 30564 1726882858.12498: variable 'ansible_distribution' from source: facts 30564 1726882858.12505: variable '__network_rh_distros' from source: role '' defaults 30564 1726882858.12515: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.12533: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882858.12708: variable 'ansible_distribution' from source: facts 30564 1726882858.12716: variable '__network_rh_distros' from source: role '' defaults 30564 1726882858.12725: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.12742: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882858.12910: variable 'ansible_distribution' from source: facts 30564 1726882858.12921: variable '__network_rh_distros' from source: role '' defaults 30564 1726882858.12930: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.12970: variable 'network_provider' from source: set_fact 30564 1726882858.12996: variable 'omit' from source: magic vars 30564 1726882858.13025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882858.13054: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882858.13083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882858.13104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882858.13118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882858.13149: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882858.13156: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882858.13166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882858.13272: Set connection var ansible_timeout to 10 30564 1726882858.13283: Set connection var ansible_pipelining to False 30564 1726882858.13290: Set connection var ansible_shell_type to sh 30564 1726882858.13299: Set connection var ansible_shell_executable to /bin/sh 30564 1726882858.13310: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882858.13315: Set connection var ansible_connection to ssh 30564 1726882858.13344: variable 'ansible_shell_executable' from source: unknown 30564 1726882858.13351: variable 'ansible_connection' from source: unknown 30564 1726882858.13357: variable 'ansible_module_compression' from source: unknown 30564 1726882858.13362: variable 'ansible_shell_type' from source: unknown 30564 1726882858.13374: variable 'ansible_shell_executable' from source: unknown 30564 1726882858.13380: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882858.13387: variable 'ansible_pipelining' from source: unknown 30564 1726882858.13393: variable 'ansible_timeout' from source: unknown 30564 1726882858.13403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882858.13508: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882858.13529: variable 'omit' from source: magic vars 30564 1726882858.13538: starting attempt loop 30564 1726882858.13544: running the handler 30564 1726882858.13625: variable 'ansible_facts' from source: unknown 30564 1726882858.14452: _low_level_execute_command(): starting 30564 1726882858.14462: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882858.15180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882858.15197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.15213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.15233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.15286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.15298: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882858.15312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.15330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882858.15342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882858.15353: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882858.15370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.15385: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882858.15401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.15414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.15425: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882858.15439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.15515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882858.15530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882858.15543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882858.15698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882858.17349: stdout chunk (state=3): >>>/root <<< 30564 1726882858.17455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882858.17531: stderr chunk (state=3): >>><<< 30564 1726882858.17534: stdout chunk (state=3): >>><<< 30564 1726882858.17631: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882858.17635: _low_level_execute_command(): starting 30564 1726882858.17637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454 `" && echo ansible-tmp-1726882858.1754925-32984-29993850040454="` echo /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454 `" ) && sleep 0' 30564 1726882858.18217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882858.18231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.18245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.18259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.18299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.18309: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882858.18322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.18337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882858.18346: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882858.18354: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882858.18368: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882858.18380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.18395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.18407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.18418: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882858.18432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.18508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882858.18525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882858.18540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882858.18681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882858.20594: stdout chunk (state=3): >>>ansible-tmp-1726882858.1754925-32984-29993850040454=/root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454 <<< 30564 1726882858.20708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882858.20797: stderr chunk (state=3): >>><<< 30564 1726882858.20800: stdout chunk (state=3): >>><<< 30564 1726882858.20874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882858.1754925-32984-29993850040454=/root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882858.20881: variable 'ansible_module_compression' from source: unknown 30564 1726882858.21074: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882858.21078: variable 'ansible_facts' from source: unknown 30564 1726882858.21183: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454/AnsiballZ_systemd.py 30564 1726882858.21342: Sending initial data 30564 1726882858.21346: Sent initial data (155 bytes) 30564 1726882858.22316: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.22337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.22362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.22368: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882858.22370: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.22381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882858.22387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882858.22392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.22401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.22406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.22413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.22418: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882858.22423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.22478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882858.22487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882858.22490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882858.22612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882858.24384: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 30564 1726882858.24480: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882858.24602: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpdp6qlkly /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454/AnsiballZ_systemd.py <<< 30564 1726882858.24699: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882858.27680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882858.27773: stderr chunk (state=3): >>><<< 30564 1726882858.27777: stdout chunk (state=3): >>><<< 30564 1726882858.27794: done transferring module to remote 30564 1726882858.27805: _low_level_execute_command(): starting 30564 1726882858.27810: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454/ /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454/AnsiballZ_systemd.py && sleep 0' 30564 1726882858.28408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882858.28417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.28426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.28438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.28500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.28508: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882858.28522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.28530: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 30564 1726882858.28536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882858.28553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882858.28966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.28978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.28990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.28998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.29007: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882858.29014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.29085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882858.29098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882858.29110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882858.29233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882858.31145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882858.31149: stdout chunk (state=3): >>><<< 30564 1726882858.31157: stderr chunk (state=3): >>><<< 30564 1726882858.31178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882858.31181: _low_level_execute_command(): starting 30564 1726882858.31186: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454/AnsiballZ_systemd.py && sleep 0' 30564 1726882858.31792: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882858.31800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.31810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.31829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.31866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.31872: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882858.31883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.31895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882858.31902: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882858.31908: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882858.31915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.31926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.31942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.31949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.31957: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882858.31970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.32040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882858.32060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882858.32074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882858.32206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882858.57413: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", 
"Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882858.57450: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9158656", "MemoryAvailable": "infinity", "CPUUsageNSec": "2195739000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not 
set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882858.57459: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target 
cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, 
"state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882858.59124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882858.59127: stdout chunk (state=3): >>><<< 30564 1726882858.59130: stderr chunk (state=3): >>><<< 30564 1726882858.59172: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9158656", "MemoryAvailable": "infinity", "CPUUsageNSec": "2195739000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", 
"IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", 
"ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882858.59442: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882858.59446: _low_level_execute_command(): starting 30564 1726882858.59448: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882858.1754925-32984-29993850040454/ > /dev/null 2>&1 && sleep 0' 30564 1726882858.60049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882858.60065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.60083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.60107: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.60153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.60167: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882858.60184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.60202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882858.60220: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882858.60235: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882858.60247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882858.60261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882858.60283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882858.60296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882858.60307: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882858.60323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882858.60410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882858.60434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882858.60459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882858.60595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882858.62448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882858.62519: stderr chunk (state=3): >>><<< 
30564 1726882858.62530: stdout chunk (state=3): >>><<< 30564 1726882858.62572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882858.62575: handler run complete 30564 1726882858.62671: attempt loop complete, returning result 30564 1726882858.62674: _execute() done 30564 1726882858.62676: dumping result to json 30564 1726882858.62679: done dumping result, returning 30564 1726882858.62681: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000001286] 30564 1726882858.62683: sending task result for task 0e448fcc-3ce9-4216-acec-000000001286 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882858.62991: no more pending results, 
returning what we have 30564 1726882858.62994: results queue empty 30564 1726882858.62995: checking for any_errors_fatal 30564 1726882858.63000: done checking for any_errors_fatal 30564 1726882858.63000: checking for max_fail_percentage 30564 1726882858.63002: done checking for max_fail_percentage 30564 1726882858.63003: checking to see if all hosts have failed and the running result is not ok 30564 1726882858.63003: done checking to see if all hosts have failed 30564 1726882858.63004: getting the remaining hosts for this loop 30564 1726882858.63006: done getting the remaining hosts for this loop 30564 1726882858.63010: getting the next task for host managed_node2 30564 1726882858.63019: done getting next task for host managed_node2 30564 1726882858.63022: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882858.63027: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 30564 1726882858.63039: getting variables 30564 1726882858.63041: in VariableManager get_vars() 30564 1726882858.63082: Calling all_inventory to load vars for managed_node2 30564 1726882858.63084: Calling groups_inventory to load vars for managed_node2 30564 1726882858.63087: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882858.63097: Calling all_plugins_play to load vars for managed_node2 30564 1726882858.63099: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882858.63102: Calling groups_plugins_play to load vars for managed_node2 30564 1726882858.63780: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001286 30564 1726882858.63788: WORKER PROCESS EXITING 30564 1726882858.64666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882858.66554: done with get_vars() 30564 1726882858.66585: done getting variables 30564 1726882858.66655: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:40:58 -0400 (0:00:00.615) 0:00:57.248 ****** 30564 1726882858.66695: entering _queue_task() for managed_node2/service 30564 1726882858.67048: worker is 1 (out of 1 available) 30564 1726882858.67067: exiting _queue_task() for managed_node2/service 30564 1726882858.67087: done queuing things up, now waiting for results queue to drain 30564 1726882858.67089: waiting for pending results... 
30564 1726882858.67420: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882858.67571: in run() - task 0e448fcc-3ce9-4216-acec-000000001287 30564 1726882858.67592: variable 'ansible_search_path' from source: unknown 30564 1726882858.67600: variable 'ansible_search_path' from source: unknown 30564 1726882858.67650: calling self._execute() 30564 1726882858.67781: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882858.67792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882858.67806: variable 'omit' from source: magic vars 30564 1726882858.68221: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.68240: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882858.68384: variable 'network_provider' from source: set_fact 30564 1726882858.68396: Evaluated conditional (network_provider == "nm"): True 30564 1726882858.68511: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882858.68617: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882858.68825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882858.77588: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882858.77653: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882858.77706: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882858.77777: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882858.77795: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882858.77865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.77889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.77910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.77936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.77947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.77980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.77997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.78017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.78056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.78062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.78092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882858.78111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882858.78129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.78153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882858.78165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882858.78256: variable 'network_connections' from source: include params 30564 1726882858.78266: variable 'interface' from source: play vars 30564 1726882858.78317: variable 'interface' from source: play vars 30564 1726882858.78365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882858.78466: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882858.78493: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882858.78515: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882858.78543: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882858.78577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882858.78593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882858.78610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882858.78627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882858.78656: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882858.78806: variable 'network_connections' from source: include params 30564 1726882858.78810: variable 'interface' from source: play vars 30564 1726882858.78851: variable 'interface' from source: play vars 30564 1726882858.78874: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882858.78878: when evaluation is False, skipping this task 30564 1726882858.78880: _execute() done 30564 1726882858.78882: dumping result to json 30564 1726882858.78884: done dumping result, returning 30564 1726882858.78891: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000001287] 30564 
1726882858.78900: sending task result for task 0e448fcc-3ce9-4216-acec-000000001287 30564 1726882858.78986: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001287 30564 1726882858.78988: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882858.79030: no more pending results, returning what we have 30564 1726882858.79033: results queue empty 30564 1726882858.79034: checking for any_errors_fatal 30564 1726882858.79050: done checking for any_errors_fatal 30564 1726882858.79050: checking for max_fail_percentage 30564 1726882858.79052: done checking for max_fail_percentage 30564 1726882858.79053: checking to see if all hosts have failed and the running result is not ok 30564 1726882858.79054: done checking to see if all hosts have failed 30564 1726882858.79055: getting the remaining hosts for this loop 30564 1726882858.79056: done getting the remaining hosts for this loop 30564 1726882858.79060: getting the next task for host managed_node2 30564 1726882858.79071: done getting next task for host managed_node2 30564 1726882858.79076: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882858.79081: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882858.79106: getting variables 30564 1726882858.79107: in VariableManager get_vars() 30564 1726882858.79142: Calling all_inventory to load vars for managed_node2 30564 1726882858.79145: Calling groups_inventory to load vars for managed_node2 30564 1726882858.79147: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882858.79156: Calling all_plugins_play to load vars for managed_node2 30564 1726882858.79158: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882858.79161: Calling groups_plugins_play to load vars for managed_node2 30564 1726882858.85074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882858.87194: done with get_vars() 30564 1726882858.87218: done getting variables 30564 1726882858.87271: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:40:58 -0400 (0:00:00.206) 0:00:57.454 
****** 30564 1726882858.87302: entering _queue_task() for managed_node2/service 30564 1726882858.87632: worker is 1 (out of 1 available) 30564 1726882858.87643: exiting _queue_task() for managed_node2/service 30564 1726882858.87656: done queuing things up, now waiting for results queue to drain 30564 1726882858.87658: waiting for pending results... 30564 1726882858.87976: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882858.88144: in run() - task 0e448fcc-3ce9-4216-acec-000000001288 30564 1726882858.88165: variable 'ansible_search_path' from source: unknown 30564 1726882858.88177: variable 'ansible_search_path' from source: unknown 30564 1726882858.88221: calling self._execute() 30564 1726882858.88328: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882858.88340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882858.88354: variable 'omit' from source: magic vars 30564 1726882858.88741: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.88765: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882858.88893: variable 'network_provider' from source: set_fact 30564 1726882858.88904: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882858.88913: when evaluation is False, skipping this task 30564 1726882858.88920: _execute() done 30564 1726882858.88927: dumping result to json 30564 1726882858.88934: done dumping result, returning 30564 1726882858.88945: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000001288] 30564 1726882858.88955: sending task result for task 0e448fcc-3ce9-4216-acec-000000001288 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
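Editorial note: the "Enable network service" task above is skipped because its conditional evaluated False; the log shows `Evaluated conditional (network_provider == "initscripts"): False`, since `network_provider` was set to `"nm"` earlier. A minimal sketch of how such a per-host conditional skip is expressed; the task body here is hypothetical, and only the `when:` expression is taken from the log:

```yaml
# Hypothetical task body; only the when: condition appears in the log.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  # Evaluated per host; when it is False, Ansible emits
  # "when evaluation is False, skipping this task" as seen above.
  when: network_provider == "initscripts"
```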
30564 1726882858.89110: no more pending results, returning what we have 30564 1726882858.89114: results queue empty 30564 1726882858.89115: checking for any_errors_fatal 30564 1726882858.89125: done checking for any_errors_fatal 30564 1726882858.89126: checking for max_fail_percentage 30564 1726882858.89127: done checking for max_fail_percentage 30564 1726882858.89128: checking to see if all hosts have failed and the running result is not ok 30564 1726882858.89129: done checking to see if all hosts have failed 30564 1726882858.89130: getting the remaining hosts for this loop 30564 1726882858.89132: done getting the remaining hosts for this loop 30564 1726882858.89135: getting the next task for host managed_node2 30564 1726882858.89144: done getting next task for host managed_node2 30564 1726882858.89148: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882858.89153: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882858.89186: getting variables 30564 1726882858.89188: in VariableManager get_vars() 30564 1726882858.89227: Calling all_inventory to load vars for managed_node2 30564 1726882858.89230: Calling groups_inventory to load vars for managed_node2 30564 1726882858.89233: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882858.89245: Calling all_plugins_play to load vars for managed_node2 30564 1726882858.89248: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882858.89251: Calling groups_plugins_play to load vars for managed_node2 30564 1726882858.90281: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001288 30564 1726882858.90285: WORKER PROCESS EXITING 30564 1726882858.91096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882858.92896: done with get_vars() 30564 1726882858.92918: done getting variables 30564 1726882858.92980: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:40:58 -0400 (0:00:00.057) 0:00:57.511 ****** 30564 1726882858.93013: entering _queue_task() for managed_node2/copy 30564 1726882858.93314: worker is 1 (out of 1 available) 30564 1726882858.93326: exiting _queue_task() for managed_node2/copy 30564 1726882858.93338: done queuing things up, now waiting for results queue to drain 30564 1726882858.93339: waiting for pending 
results... 30564 1726882858.93628: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882858.93782: in run() - task 0e448fcc-3ce9-4216-acec-000000001289 30564 1726882858.93801: variable 'ansible_search_path' from source: unknown 30564 1726882858.93809: variable 'ansible_search_path' from source: unknown 30564 1726882858.93845: calling self._execute() 30564 1726882858.93951: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882858.93962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882858.93981: variable 'omit' from source: magic vars 30564 1726882858.94361: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.94385: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882858.94508: variable 'network_provider' from source: set_fact 30564 1726882858.94519: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882858.94527: when evaluation is False, skipping this task 30564 1726882858.94537: _execute() done 30564 1726882858.94545: dumping result to json 30564 1726882858.94552: done dumping result, returning 30564 1726882858.94565: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000001289] 30564 1726882858.94578: sending task result for task 0e448fcc-3ce9-4216-acec-000000001289 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882858.94724: no more pending results, returning what we have 30564 1726882858.94728: results queue empty 30564 1726882858.94729: checking for any_errors_fatal 30564 1726882858.94737: done checking for any_errors_fatal 30564 1726882858.94738: checking for max_fail_percentage 
30564 1726882858.94740: done checking for max_fail_percentage 30564 1726882858.94741: checking to see if all hosts have failed and the running result is not ok 30564 1726882858.94742: done checking to see if all hosts have failed 30564 1726882858.94743: getting the remaining hosts for this loop 30564 1726882858.94744: done getting the remaining hosts for this loop 30564 1726882858.94748: getting the next task for host managed_node2 30564 1726882858.94757: done getting next task for host managed_node2 30564 1726882858.94761: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882858.94771: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882858.94798: getting variables 30564 1726882858.94800: in VariableManager get_vars() 30564 1726882858.94838: Calling all_inventory to load vars for managed_node2 30564 1726882858.94841: Calling groups_inventory to load vars for managed_node2 30564 1726882858.94844: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882858.94855: Calling all_plugins_play to load vars for managed_node2 30564 1726882858.94858: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882858.94861: Calling groups_plugins_play to load vars for managed_node2 30564 1726882858.95884: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001289 30564 1726882858.95888: WORKER PROCESS EXITING 30564 1726882858.96555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882858.98363: done with get_vars() 30564 1726882858.98387: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:40:58 -0400 (0:00:00.054) 0:00:57.565 ****** 30564 1726882858.98466: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882858.98720: worker is 1 (out of 1 available) 30564 1726882858.98732: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882858.98744: done queuing things up, now waiting for results queue to drain 30564 1726882858.98745: waiting for pending results... 
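Editorial note: each verbose line is prefixed by the ansible-playbook process ID (30564 throughout this run) and a Unix epoch timestamp with sub-second precision. These epoch values can be converted to the local wall-clock times printed in the TASK headers; a small sketch using a prefix taken from the log above, with the UTC-04:00 offset read from the headers:

```python
from datetime import datetime, timezone, timedelta

# The log prefix "30564 1726882858.98466" is <pid> <unix-epoch-seconds>.
ts = 1726882858.98466

# TASK headers in this run print local time at UTC-04:00 (US Eastern DST).
eastern = timezone(timedelta(hours=-4))
local = datetime.fromtimestamp(ts, tz=eastern)

print(local.strftime("%A %d %B %Y  %H:%M:%S %z"))
# → Friday 20 September 2024  21:40:58 -0400
```

This matches the "Friday 20 September 2024 21:40:58 -0400" stamp on the surrounding TASK headers, so the epoch prefixes and the header clock are the same timeline.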
30564 1726882858.99020: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882858.99177: in run() - task 0e448fcc-3ce9-4216-acec-00000000128a 30564 1726882858.99201: variable 'ansible_search_path' from source: unknown 30564 1726882858.99211: variable 'ansible_search_path' from source: unknown 30564 1726882858.99246: calling self._execute() 30564 1726882858.99348: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882858.99359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882858.99378: variable 'omit' from source: magic vars 30564 1726882858.99753: variable 'ansible_distribution_major_version' from source: facts 30564 1726882858.99776: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882858.99787: variable 'omit' from source: magic vars 30564 1726882858.99863: variable 'omit' from source: magic vars 30564 1726882859.00033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882859.02636: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882859.02714: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882859.02776: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882859.02818: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882859.02852: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882859.02951: variable 'network_provider' from source: set_fact 30564 1726882859.03102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882859.03138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882859.03175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882859.03228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882859.03251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882859.03340: variable 'omit' from source: magic vars 30564 1726882859.03458: variable 'omit' from source: magic vars 30564 1726882859.03567: variable 'network_connections' from source: include params 30564 1726882859.03586: variable 'interface' from source: play vars 30564 1726882859.03650: variable 'interface' from source: play vars 30564 1726882859.03806: variable 'omit' from source: magic vars 30564 1726882859.03821: variable '__lsr_ansible_managed' from source: task vars 30564 1726882859.03894: variable '__lsr_ansible_managed' from source: task vars 30564 1726882859.04074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882859.04293: Loaded config def from plugin (lookup/template) 30564 1726882859.04304: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882859.04335: File lookup term: get_ansible_managed.j2 30564 1726882859.04343: variable 
'ansible_search_path' from source: unknown 30564 1726882859.04352: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882859.04375: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882859.04401: variable 'ansible_search_path' from source: unknown 30564 1726882859.10894: variable 'ansible_managed' from source: unknown 30564 1726882859.11046: variable 'omit' from source: magic vars 30564 1726882859.11084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882859.11121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882859.11145: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882859.11173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882859.11190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882859.11225: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882859.11236: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.11244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.11354: Set connection var ansible_timeout to 10 30564 1726882859.11367: Set connection var ansible_pipelining to False 30564 1726882859.11379: Set connection var ansible_shell_type to sh 30564 1726882859.11390: Set connection var ansible_shell_executable to /bin/sh 30564 1726882859.11402: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882859.11408: Set connection var ansible_connection to ssh 30564 1726882859.11439: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.11451: variable 'ansible_connection' from source: unknown 30564 1726882859.11458: variable 'ansible_module_compression' from source: unknown 30564 1726882859.11470: variable 'ansible_shell_type' from source: unknown 30564 1726882859.11478: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.11486: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.11502: variable 'ansible_pipelining' from source: unknown 30564 1726882859.11510: variable 'ansible_timeout' from source: unknown 30564 1726882859.11518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.11658: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882859.11691: variable 'omit' from 
source: magic vars 30564 1726882859.11702: starting attempt loop 30564 1726882859.11709: running the handler 30564 1726882859.11727: _low_level_execute_command(): starting 30564 1726882859.11737: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882859.12512: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882859.12525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.12539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.12561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.12607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.12621: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882859.12637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.12658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882859.12679: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882859.12691: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882859.12702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.12715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.12730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.12743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.12757: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882859.12777: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.12819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882859.12836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882859.12840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882859.12951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882859.14605: stdout chunk (state=3): >>>/root <<< 30564 1726882859.14714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882859.14758: stderr chunk (state=3): >>><<< 30564 1726882859.14761: stdout chunk (state=3): >>><<< 30564 1726882859.14782: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882859.14791: 
_low_level_execute_command(): starting 30564 1726882859.14796: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269 `" && echo ansible-tmp-1726882859.147813-33022-246359935929269="` echo /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269 `" ) && sleep 0' 30564 1726882859.15360: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882859.15382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882859.15504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882859.17371: stdout chunk (state=3): >>>ansible-tmp-1726882859.147813-33022-246359935929269=/root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269 <<< 30564 1726882859.17483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882859.17526: stderr chunk (state=3): 
>>><<< 30564 1726882859.17529: stdout chunk (state=3): >>><<< 30564 1726882859.17541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882859.147813-33022-246359935929269=/root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882859.17577: variable 'ansible_module_compression' from source: unknown 30564 1726882859.17614: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882859.17637: variable 'ansible_facts' from source: unknown 30564 1726882859.17704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269/AnsiballZ_network_connections.py 30564 1726882859.17799: Sending initial data 30564 1726882859.17802: 
Sent initial data (167 bytes) 30564 1726882859.18423: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.18431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.18471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.18474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.18484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.18528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882859.18539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882859.18644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882859.20388: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30564 1726882859.20399: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 30564 1726882859.20409: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30564 1726882859.20444: stderr chunk (state=3): >>>debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 <<< 30564 1726882859.20448: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882859.20562: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 30564 1726882859.20567: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 30564 1726882859.20569: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 30564 1726882859.20669: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpss9tt3jb /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269/AnsiballZ_network_connections.py <<< 30564 1726882859.20771: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882859.22436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882859.22656: stderr chunk (state=3): >>><<< 30564 1726882859.22659: stdout chunk (state=3): >>><<< 30564 1726882859.22661: done transferring module to remote 30564 1726882859.22663: _low_level_execute_command(): starting 30564 1726882859.22668: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269/ /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269/AnsiballZ_network_connections.py && sleep 0' 30564 1726882859.23171: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882859.23183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.23195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882859.23209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.23277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.23292: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882859.23305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.23322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882859.23333: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882859.23344: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882859.23357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.23374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.23390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.23402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.23412: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882859.23425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.23498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882859.23518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882859.23533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882859.24359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882859.25753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882859.25830: 
stderr chunk (state=3): >>><<< 30564 1726882859.25840: stdout chunk (state=3): >>><<< 30564 1726882859.25936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882859.25939: _low_level_execute_command(): starting 30564 1726882859.25941: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269/AnsiballZ_network_connections.py && sleep 0' 30564 1726882859.26458: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882859.26477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.26496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.26515: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.26557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.26573: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882859.26590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.26614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882859.26626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882859.26638: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882859.26650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.26668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.26689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.26703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.26717: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882859.26730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.26804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882859.26831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882859.26848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882859.26990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882859.49861: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 
ef91e5fd-4b93-4ee4-ae54-4de7a703b196 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882859.51290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882859.51372: stderr chunk (state=3): >>><<< 30564 1726882859.51375: stdout chunk (state=3): >>><<< 30564 1726882859.51395: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882859.51452: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882859.51456: _low_level_execute_command(): starting 30564 1726882859.51458: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882859.147813-33022-246359935929269/ > /dev/null 2>&1 && sleep 0' 30564 1726882859.52106: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882859.52116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.52127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.52142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.52273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.52277: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882859.52280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.52283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882859.52285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882859.52288: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882859.52290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882859.52293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882859.52295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882859.52298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882859.52301: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882859.52303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882859.52391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882859.52394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882859.52403: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882859.52542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882859.54443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882859.54447: stdout chunk (state=3): >>><<< 30564 1726882859.54454: stderr chunk (state=3): >>><<< 30564 1726882859.54476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882859.54484: handler run complete 30564 1726882859.54510: attempt loop complete, returning result 30564 1726882859.54513: _execute() done 30564 1726882859.54515: dumping result to json 30564 1726882859.54520: done dumping result, returning 30564 1726882859.54530: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 
[0e448fcc-3ce9-4216-acec-00000000128a] 30564 1726882859.54535: sending task result for task 0e448fcc-3ce9-4216-acec-00000000128a 30564 1726882859.54645: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000128a 30564 1726882859.54647: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196 skipped because already active 30564 1726882859.54872: no more pending results, returning what we have 30564 1726882859.54875: results queue empty 30564 1726882859.54876: checking for any_errors_fatal 30564 1726882859.54882: done checking for any_errors_fatal 30564 1726882859.54883: checking for max_fail_percentage 30564 1726882859.54885: done checking for max_fail_percentage 30564 1726882859.54886: checking to see if all hosts have failed and the running result is not ok 30564 1726882859.54887: done checking to see if all hosts have failed 30564 1726882859.54888: getting the remaining hosts for this loop 30564 1726882859.54889: done getting the remaining hosts for this loop 30564 1726882859.54893: getting the next task for host managed_node2 30564 1726882859.54901: done getting next task for host managed_node2 30564 1726882859.54905: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882859.54910: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882859.54924: getting variables 30564 1726882859.54925: in VariableManager get_vars() 30564 1726882859.54966: Calling all_inventory to load vars for managed_node2 30564 1726882859.54969: Calling groups_inventory to load vars for managed_node2 30564 1726882859.54972: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882859.54982: Calling all_plugins_play to load vars for managed_node2 30564 1726882859.54985: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882859.54988: Calling groups_plugins_play to load vars for managed_node2 30564 1726882859.58107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882859.61788: done with get_vars() 30564 1726882859.61812: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:40:59 -0400 
(0:00:00.635) 0:00:58.201 ****** 30564 1726882859.62021: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882859.62826: worker is 1 (out of 1 available) 30564 1726882859.62839: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882859.62851: done queuing things up, now waiting for results queue to drain 30564 1726882859.62853: waiting for pending results... 30564 1726882859.63477: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882859.63927: in run() - task 0e448fcc-3ce9-4216-acec-00000000128b 30564 1726882859.64060: variable 'ansible_search_path' from source: unknown 30564 1726882859.64074: variable 'ansible_search_path' from source: unknown 30564 1726882859.64118: calling self._execute() 30564 1726882859.64276: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.64727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.64758: variable 'omit' from source: magic vars 30564 1726882859.65648: variable 'ansible_distribution_major_version' from source: facts 30564 1726882859.65673: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882859.65901: variable 'network_state' from source: role '' defaults 30564 1726882859.66020: Evaluated conditional (network_state != {}): False 30564 1726882859.66195: when evaluation is False, skipping this task 30564 1726882859.66204: _execute() done 30564 1726882859.66212: dumping result to json 30564 1726882859.66219: done dumping result, returning 30564 1726882859.66237: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-00000000128b] 30564 1726882859.66261: sending task result for task 0e448fcc-3ce9-4216-acec-00000000128b skipping: [managed_node2] => { "changed": false, "false_condition": 
"network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882859.66504: no more pending results, returning what we have 30564 1726882859.66508: results queue empty 30564 1726882859.66509: checking for any_errors_fatal 30564 1726882859.66527: done checking for any_errors_fatal 30564 1726882859.66528: checking for max_fail_percentage 30564 1726882859.66530: done checking for max_fail_percentage 30564 1726882859.66531: checking to see if all hosts have failed and the running result is not ok 30564 1726882859.66532: done checking to see if all hosts have failed 30564 1726882859.66532: getting the remaining hosts for this loop 30564 1726882859.66534: done getting the remaining hosts for this loop 30564 1726882859.66538: getting the next task for host managed_node2 30564 1726882859.66546: done getting next task for host managed_node2 30564 1726882859.66550: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882859.66555: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882859.66574: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000128b 30564 1726882859.66587: WORKER PROCESS EXITING 30564 1726882859.66611: getting variables 30564 1726882859.66613: in VariableManager get_vars() 30564 1726882859.66653: Calling all_inventory to load vars for managed_node2 30564 1726882859.66656: Calling groups_inventory to load vars for managed_node2 30564 1726882859.66658: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882859.66672: Calling all_plugins_play to load vars for managed_node2 30564 1726882859.66675: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882859.66683: Calling groups_plugins_play to load vars for managed_node2 30564 1726882859.69982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882859.74252: done with get_vars() 30564 1726882859.74286: done getting variables 30564 1726882859.74347: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:40:59 -0400 (0:00:00.123) 0:00:58.325 ****** 30564 1726882859.74392: entering _queue_task() for managed_node2/debug 30564 1726882859.74783: worker is 1 (out of 1 available) 30564 1726882859.74800: exiting _queue_task() for managed_node2/debug 30564 1726882859.74812: done queuing things up, now waiting 
for results queue to drain 30564 1726882859.74814: waiting for pending results... 30564 1726882859.75128: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882859.75415: in run() - task 0e448fcc-3ce9-4216-acec-00000000128c 30564 1726882859.75443: variable 'ansible_search_path' from source: unknown 30564 1726882859.75452: variable 'ansible_search_path' from source: unknown 30564 1726882859.75498: calling self._execute() 30564 1726882859.75612: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.75624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.75646: variable 'omit' from source: magic vars 30564 1726882859.76207: variable 'ansible_distribution_major_version' from source: facts 30564 1726882859.76233: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882859.76243: variable 'omit' from source: magic vars 30564 1726882859.76320: variable 'omit' from source: magic vars 30564 1726882859.76356: variable 'omit' from source: magic vars 30564 1726882859.76414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882859.76460: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882859.76487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882859.76510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882859.76530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882859.76572: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882859.76582: variable 'ansible_host' from source: host vars 
for 'managed_node2' 30564 1726882859.76591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.76708: Set connection var ansible_timeout to 10 30564 1726882859.76718: Set connection var ansible_pipelining to False 30564 1726882859.76725: Set connection var ansible_shell_type to sh 30564 1726882859.76735: Set connection var ansible_shell_executable to /bin/sh 30564 1726882859.76747: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882859.76846: Set connection var ansible_connection to ssh 30564 1726882859.76884: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.76892: variable 'ansible_connection' from source: unknown 30564 1726882859.76898: variable 'ansible_module_compression' from source: unknown 30564 1726882859.76905: variable 'ansible_shell_type' from source: unknown 30564 1726882859.76911: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.76917: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.76924: variable 'ansible_pipelining' from source: unknown 30564 1726882859.76930: variable 'ansible_timeout' from source: unknown 30564 1726882859.76937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.77088: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882859.77105: variable 'omit' from source: magic vars 30564 1726882859.77115: starting attempt loop 30564 1726882859.77122: running the handler 30564 1726882859.77257: variable '__network_connections_result' from source: set_fact 30564 1726882859.77320: handler run complete 30564 1726882859.77343: attempt loop complete, returning result 30564 1726882859.77350: 
_execute() done 30564 1726882859.77358: dumping result to json 30564 1726882859.77368: done dumping result, returning 30564 1726882859.77420: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000128c] 30564 1726882859.77431: sending task result for task 0e448fcc-3ce9-4216-acec-00000000128c ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196 skipped because already active" ] } 30564 1726882859.77612: no more pending results, returning what we have 30564 1726882859.77616: results queue empty 30564 1726882859.77617: checking for any_errors_fatal 30564 1726882859.77625: done checking for any_errors_fatal 30564 1726882859.77625: checking for max_fail_percentage 30564 1726882859.77628: done checking for max_fail_percentage 30564 1726882859.77629: checking to see if all hosts have failed and the running result is not ok 30564 1726882859.77629: done checking to see if all hosts have failed 30564 1726882859.77630: getting the remaining hosts for this loop 30564 1726882859.77633: done getting the remaining hosts for this loop 30564 1726882859.77637: getting the next task for host managed_node2 30564 1726882859.77648: done getting next task for host managed_node2 30564 1726882859.77653: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882859.77659: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882859.77677: getting variables 30564 1726882859.77680: in VariableManager get_vars() 30564 1726882859.77720: Calling all_inventory to load vars for managed_node2 30564 1726882859.77723: Calling groups_inventory to load vars for managed_node2 30564 1726882859.77725: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882859.77737: Calling all_plugins_play to load vars for managed_node2 30564 1726882859.77740: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882859.77744: Calling groups_plugins_play to load vars for managed_node2 30564 1726882859.78861: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000128c 30564 1726882859.78864: WORKER PROCESS EXITING 30564 1726882859.80398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882859.82357: done with get_vars() 30564 1726882859.82384: done getting variables 30564 1726882859.82442: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:40:59 -0400 (0:00:00.080) 0:00:58.406 ****** 30564 1726882859.82490: entering _queue_task() for managed_node2/debug 30564 1726882859.82791: worker is 1 (out of 1 available) 30564 1726882859.82805: exiting _queue_task() for managed_node2/debug 30564 1726882859.82817: done queuing things up, now waiting for results queue to drain 30564 1726882859.82819: waiting for pending results... 30564 1726882859.83114: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882859.83271: in run() - task 0e448fcc-3ce9-4216-acec-00000000128d 30564 1726882859.83293: variable 'ansible_search_path' from source: unknown 30564 1726882859.83301: variable 'ansible_search_path' from source: unknown 30564 1726882859.83346: calling self._execute() 30564 1726882859.83459: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.83473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.83491: variable 'omit' from source: magic vars 30564 1726882859.83889: variable 'ansible_distribution_major_version' from source: facts 30564 1726882859.83910: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882859.83925: variable 'omit' from source: magic vars 30564 1726882859.84002: variable 'omit' from source: magic vars 30564 1726882859.84045: variable 'omit' from source: magic vars 30564 1726882859.84095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882859.84139: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882859.84165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882859.84189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882859.84206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882859.84246: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882859.84255: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.84263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.84383: Set connection var ansible_timeout to 10 30564 1726882859.84395: Set connection var ansible_pipelining to False 30564 1726882859.84402: Set connection var ansible_shell_type to sh 30564 1726882859.84412: Set connection var ansible_shell_executable to /bin/sh 30564 1726882859.84424: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882859.84436: Set connection var ansible_connection to ssh 30564 1726882859.84471: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.84481: variable 'ansible_connection' from source: unknown 30564 1726882859.84489: variable 'ansible_module_compression' from source: unknown 30564 1726882859.84496: variable 'ansible_shell_type' from source: unknown 30564 1726882859.84504: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.84510: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.84520: variable 'ansible_pipelining' from source: unknown 30564 1726882859.84527: variable 'ansible_timeout' from source: unknown 30564 1726882859.84537: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882859.84694: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882859.84711: variable 'omit' from source: magic vars 30564 1726882859.84721: starting attempt loop 30564 1726882859.84728: running the handler 30564 1726882859.84787: variable '__network_connections_result' from source: set_fact 30564 1726882859.84867: variable '__network_connections_result' from source: set_fact 30564 1726882859.84989: handler run complete 30564 1726882859.85026: attempt loop complete, returning result 30564 1726882859.85033: _execute() done 30564 1726882859.85039: dumping result to json 30564 1726882859.85046: done dumping result, returning 30564 1726882859.85057: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000128d] 30564 1726882859.85070: sending task result for task 0e448fcc-3ce9-4216-acec-00000000128d ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, ef91e5fd-4b93-4ee4-ae54-4de7a703b196 skipped because already active" ] } } 30564 1726882859.85275: no more pending results, returning what we have 30564 1726882859.85279: results queue 
empty 30564 1726882859.85280: checking for any_errors_fatal 30564 1726882859.85286: done checking for any_errors_fatal 30564 1726882859.85287: checking for max_fail_percentage 30564 1726882859.85289: done checking for max_fail_percentage 30564 1726882859.85290: checking to see if all hosts have failed and the running result is not ok 30564 1726882859.85291: done checking to see if all hosts have failed 30564 1726882859.85292: getting the remaining hosts for this loop 30564 1726882859.85293: done getting the remaining hosts for this loop 30564 1726882859.85297: getting the next task for host managed_node2 30564 1726882859.85306: done getting next task for host managed_node2 30564 1726882859.85310: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882859.85315: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882859.85330: getting variables 30564 1726882859.85332: in VariableManager get_vars() 30564 1726882859.85372: Calling all_inventory to load vars for managed_node2 30564 1726882859.85375: Calling groups_inventory to load vars for managed_node2 30564 1726882859.85378: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882859.85389: Calling all_plugins_play to load vars for managed_node2 30564 1726882859.85399: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882859.85402: Calling groups_plugins_play to load vars for managed_node2 30564 1726882859.86404: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000128d 30564 1726882859.86407: WORKER PROCESS EXITING 30564 1726882859.87357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882859.89131: done with get_vars() 30564 1726882859.89157: done getting variables 30564 1726882859.89215: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:40:59 -0400 (0:00:00.067) 0:00:58.473 ****** 30564 1726882859.89250: entering _queue_task() for managed_node2/debug 30564 1726882859.89527: worker is 1 (out of 1 available) 30564 1726882859.89538: exiting _queue_task() for managed_node2/debug 30564 1726882859.89550: done queuing things up, now waiting for results queue to drain 30564 1726882859.89551: waiting for pending results... 
30564 1726882859.89846: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882859.90007: in run() - task 0e448fcc-3ce9-4216-acec-00000000128e 30564 1726882859.90031: variable 'ansible_search_path' from source: unknown 30564 1726882859.90038: variable 'ansible_search_path' from source: unknown 30564 1726882859.90076: calling self._execute() 30564 1726882859.90182: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.90194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.90207: variable 'omit' from source: magic vars 30564 1726882859.90595: variable 'ansible_distribution_major_version' from source: facts 30564 1726882859.90614: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882859.90744: variable 'network_state' from source: role '' defaults 30564 1726882859.90762: Evaluated conditional (network_state != {}): False 30564 1726882859.90773: when evaluation is False, skipping this task 30564 1726882859.90785: _execute() done 30564 1726882859.90792: dumping result to json 30564 1726882859.90800: done dumping result, returning 30564 1726882859.90812: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-00000000128e] 30564 1726882859.90823: sending task result for task 0e448fcc-3ce9-4216-acec-00000000128e skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882859.90969: no more pending results, returning what we have 30564 1726882859.90975: results queue empty 30564 1726882859.90976: checking for any_errors_fatal 30564 1726882859.90987: done checking for any_errors_fatal 30564 1726882859.90988: checking for max_fail_percentage 30564 1726882859.90990: done checking for max_fail_percentage 30564 1726882859.90991: checking to see if all hosts have 
failed and the running result is not ok 30564 1726882859.90992: done checking to see if all hosts have failed 30564 1726882859.90993: getting the remaining hosts for this loop 30564 1726882859.90995: done getting the remaining hosts for this loop 30564 1726882859.90999: getting the next task for host managed_node2 30564 1726882859.91009: done getting next task for host managed_node2 30564 1726882859.91013: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882859.91020: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882859.91047: getting variables 30564 1726882859.91049: in VariableManager get_vars() 30564 1726882859.91095: Calling all_inventory to load vars for managed_node2 30564 1726882859.91098: Calling groups_inventory to load vars for managed_node2 30564 1726882859.91101: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882859.91114: Calling all_plugins_play to load vars for managed_node2 30564 1726882859.91117: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882859.91120: Calling groups_plugins_play to load vars for managed_node2 30564 1726882859.92130: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000128e 30564 1726882859.92133: WORKER PROCESS EXITING 30564 1726882859.92848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882859.94650: done with get_vars() 30564 1726882859.94678: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:40:59 -0400 (0:00:00.055) 0:00:58.529 ****** 30564 1726882859.94777: entering _queue_task() for managed_node2/ping 30564 1726882859.95327: worker is 1 (out of 1 available) 30564 1726882859.95339: exiting _queue_task() for managed_node2/ping 30564 1726882859.95352: done queuing things up, now waiting for results queue to drain 30564 1726882859.95353: waiting for pending results... 
30564 1726882859.96194: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882859.96745: in run() - task 0e448fcc-3ce9-4216-acec-00000000128f 30564 1726882859.96768: variable 'ansible_search_path' from source: unknown 30564 1726882859.96824: variable 'ansible_search_path' from source: unknown 30564 1726882859.96862: calling self._execute() 30564 1726882859.97089: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.97100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.97119: variable 'omit' from source: magic vars 30564 1726882859.97507: variable 'ansible_distribution_major_version' from source: facts 30564 1726882859.97524: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882859.97535: variable 'omit' from source: magic vars 30564 1726882859.97615: variable 'omit' from source: magic vars 30564 1726882859.97654: variable 'omit' from source: magic vars 30564 1726882859.97705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882859.97744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882859.97775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882859.97798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882859.97820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882859.97854: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882859.97865: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.97878: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882859.97990: Set connection var ansible_timeout to 10 30564 1726882859.98001: Set connection var ansible_pipelining to False 30564 1726882859.98009: Set connection var ansible_shell_type to sh 30564 1726882859.98021: Set connection var ansible_shell_executable to /bin/sh 30564 1726882859.98037: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882859.98043: Set connection var ansible_connection to ssh 30564 1726882859.98074: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.98084: variable 'ansible_connection' from source: unknown 30564 1726882859.98094: variable 'ansible_module_compression' from source: unknown 30564 1726882859.98101: variable 'ansible_shell_type' from source: unknown 30564 1726882859.98108: variable 'ansible_shell_executable' from source: unknown 30564 1726882859.98115: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882859.98123: variable 'ansible_pipelining' from source: unknown 30564 1726882859.98132: variable 'ansible_timeout' from source: unknown 30564 1726882859.98143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882859.98353: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882859.98375: variable 'omit' from source: magic vars 30564 1726882859.98384: starting attempt loop 30564 1726882859.98391: running the handler 30564 1726882859.98407: _low_level_execute_command(): starting 30564 1726882859.98423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882860.00728: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882860.00747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882860.00877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.00896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.00940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.00953: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882860.00973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.00985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882860.00994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882860.01001: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882860.01009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.01019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.01030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.01038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.01045: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882860.01054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.01133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882860.01151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882860.01166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882860.01303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882860.03012: stdout chunk (state=3): >>>/root <<< 30564 1726882860.03196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882860.03202: stdout chunk (state=3): >>><<< 30564 1726882860.03211: stderr chunk (state=3): >>><<< 30564 1726882860.03236: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882860.03253: _low_level_execute_command(): starting 30564 1726882860.03260: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634 `" && echo ansible-tmp-1726882860.032361-33096-219235727429634="` echo /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634 `" ) && sleep 0' 30564 1726882860.05375: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 30564 1726882860.05379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.05392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.05406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.05449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.05480: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882860.05490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.05503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882860.05511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882860.05517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882860.05643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.05653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.05667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.05674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.05682: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882860.05691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.05770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882860.05787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882860.05799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30564 1726882860.05933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882860.07866: stdout chunk (state=3): >>>ansible-tmp-1726882860.032361-33096-219235727429634=/root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634 <<< 30564 1726882860.08051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882860.08054: stdout chunk (state=3): >>><<< 30564 1726882860.08063: stderr chunk (state=3): >>><<< 30564 1726882860.08087: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882860.032361-33096-219235727429634=/root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882860.08133: variable 'ansible_module_compression' from source: unknown 30564 1726882860.08176: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882860.08211: variable 'ansible_facts' from source: unknown 30564 1726882860.08295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634/AnsiballZ_ping.py 30564 1726882860.08842: Sending initial data 30564 1726882860.08845: Sent initial data (152 bytes) 30564 1726882860.12198: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882860.12208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.12219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.12233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.12274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.12405: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882860.12415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.12428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882860.12435: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882860.12442: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882860.12450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.12460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.12473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.12481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882860.12488: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882860.12497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.12574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882860.12621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882860.12633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882860.12761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882860.14638: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882860.14730: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882860.14834: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp0picnsvb /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634/AnsiballZ_ping.py <<< 30564 1726882860.14923: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882860.16481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882860.16566: stderr chunk (state=3): >>><<< 30564 1726882860.16570: stdout chunk (state=3): >>><<< 30564 1726882860.16596: done transferring module 
to remote 30564 1726882860.16607: _low_level_execute_command(): starting 30564 1726882860.16612: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634/ /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634/AnsiballZ_ping.py && sleep 0' 30564 1726882860.18041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882860.18157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.18161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.18205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882860.18211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882860.18262: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.18268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882860.18283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.18358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882860.18365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882860.18486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882860.18682: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882860.20478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882860.20568: stderr chunk (state=3): >>><<< 30564 1726882860.20571: stdout chunk (state=3): >>><<< 30564 1726882860.20667: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882860.20671: _low_level_execute_command(): starting 30564 1726882860.20673: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634/AnsiballZ_ping.py && sleep 0' 30564 1726882860.22881: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882860.23885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882860.23902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.23922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.23968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.23982: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882860.23997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.24016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882860.24029: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882860.24040: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882860.24052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.24067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.24084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.24097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.24109: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882860.24122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.24201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882860.24219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882860.24234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882860.25081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882860.37679: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882860.38788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882860.38822: stderr chunk (state=3): >>><<< 30564 1726882860.38825: stdout chunk (state=3): >>><<< 30564 1726882860.38948: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882860.38952: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882860.38959: _low_level_execute_command(): starting 30564 1726882860.38962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882860.032361-33096-219235727429634/ > /dev/null 2>&1 && sleep 0' 30564 1726882860.40841: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882860.40855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.40872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.40890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.40934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.40945: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882860.40958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.40977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882860.40988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 
1726882860.40998: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882860.41009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882860.41022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882860.41037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882860.41047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882860.41058: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882860.41073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882860.41149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882860.41167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882860.41182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882860.41311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882860.43181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882860.43243: stderr chunk (state=3): >>><<< 30564 1726882860.43247: stdout chunk (state=3): >>><<< 30564 1726882860.43370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882860.43374: handler run complete 30564 1726882860.43376: attempt loop complete, returning result 30564 1726882860.43378: _execute() done 30564 1726882860.43380: dumping result to json 30564 1726882860.43382: done dumping result, returning 30564 1726882860.43384: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-00000000128f] 30564 1726882860.43387: sending task result for task 0e448fcc-3ce9-4216-acec-00000000128f 30564 1726882860.43462: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000128f ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882860.43538: no more pending results, returning what we have 30564 1726882860.43541: results queue empty 30564 1726882860.43542: checking for any_errors_fatal 30564 1726882860.43551: done checking for any_errors_fatal 30564 1726882860.43552: checking for max_fail_percentage 30564 1726882860.43553: done checking for max_fail_percentage 30564 1726882860.43554: checking to see if all hosts have failed and the running result is not ok 30564 1726882860.43555: done checking to see if all hosts have failed 30564 1726882860.43556: getting the remaining hosts for this loop 30564 1726882860.43557: done getting the remaining hosts for this loop 30564 1726882860.43561: getting the next task for host 
managed_node2 30564 1726882860.43578: done getting next task for host managed_node2 30564 1726882860.43581: ^ task is: TASK: meta (role_complete) 30564 1726882860.43586: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882860.43598: getting variables 30564 1726882860.43600: in VariableManager get_vars() 30564 1726882860.43643: Calling all_inventory to load vars for managed_node2 30564 1726882860.43645: Calling groups_inventory to load vars for managed_node2 30564 1726882860.43648: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882860.43659: Calling all_plugins_play to load vars for managed_node2 30564 1726882860.43662: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882860.43666: Calling groups_plugins_play to load vars for managed_node2 30564 1726882860.45240: WORKER PROCESS EXITING 30564 1726882860.48262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882860.53075: done with get_vars() 30564 1726882860.53109: done getting variables 30564 1726882860.54202: done queuing things up, now waiting for results queue to drain 30564 1726882860.54205: results queue empty 30564 1726882860.54206: checking for any_errors_fatal 30564 1726882860.54209: done checking for any_errors_fatal 30564 1726882860.54210: checking for max_fail_percentage 30564 1726882860.54212: done checking for max_fail_percentage 30564 1726882860.54213: checking to see if all hosts have failed and the running result is not ok 30564 1726882860.54214: done checking to see if all hosts have failed 30564 1726882860.54214: getting the remaining hosts for this loop 30564 1726882860.54215: done getting the remaining hosts for this loop 30564 1726882860.54223: getting the next task for host managed_node2 30564 1726882860.54229: done getting next task for host managed_node2 30564 1726882860.54232: ^ task is: TASK: Test 30564 1726882860.54234: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882860.54237: getting variables 30564 1726882860.54238: in VariableManager get_vars() 30564 1726882860.54251: Calling all_inventory to load vars for managed_node2 30564 1726882860.54253: Calling groups_inventory to load vars for managed_node2 30564 1726882860.54256: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882860.54261: Calling all_plugins_play to load vars for managed_node2 30564 1726882860.54266: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882860.54272: Calling groups_plugins_play to load vars for managed_node2 30564 1726882860.56781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882860.60711: done with get_vars() 30564 1726882860.60734: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:41:00 -0400 (0:00:00.660) 0:00:59.189 ****** 30564 1726882860.60816: entering _queue_task() for managed_node2/include_tasks 30564 1726882860.61871: worker is 1 (out of 1 available) 30564 1726882860.61886: exiting _queue_task() for managed_node2/include_tasks 30564 1726882860.61899: done queuing things up, now waiting for results queue to drain 30564 1726882860.61901: waiting for pending results... 
30564 1726882860.62750: running TaskExecutor() for managed_node2/TASK: Test 30564 1726882860.62873: in run() - task 0e448fcc-3ce9-4216-acec-000000001009 30564 1726882860.63015: variable 'ansible_search_path' from source: unknown 30564 1726882860.63024: variable 'ansible_search_path' from source: unknown 30564 1726882860.63074: variable 'lsr_test' from source: include params 30564 1726882860.63528: variable 'lsr_test' from source: include params 30564 1726882860.64345: variable 'omit' from source: magic vars 30564 1726882860.64488: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882860.65417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882860.65432: variable 'omit' from source: magic vars 30564 1726882860.65680: variable 'ansible_distribution_major_version' from source: facts 30564 1726882860.65699: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882860.65710: variable 'item' from source: unknown 30564 1726882860.65780: variable 'item' from source: unknown 30564 1726882860.66804: variable 'item' from source: unknown 30564 1726882860.66872: variable 'item' from source: unknown 30564 1726882860.67033: dumping result to json 30564 1726882860.67042: done dumping result, returning 30564 1726882860.67051: done running TaskExecutor() for managed_node2/TASK: Test [0e448fcc-3ce9-4216-acec-000000001009] 30564 1726882860.67060: sending task result for task 0e448fcc-3ce9-4216-acec-000000001009 30564 1726882860.67132: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001009 30564 1726882860.67157: no more pending results, returning what we have 30564 1726882860.67162: in VariableManager get_vars() 30564 1726882860.67204: Calling all_inventory to load vars for managed_node2 30564 1726882860.67208: Calling groups_inventory to load vars for managed_node2 30564 1726882860.67211: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882860.67227: 
Calling all_plugins_play to load vars for managed_node2 30564 1726882860.67230: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882860.67233: Calling groups_plugins_play to load vars for managed_node2 30564 1726882860.68374: WORKER PROCESS EXITING 30564 1726882860.69797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882860.73505: done with get_vars() 30564 1726882860.73531: variable 'ansible_search_path' from source: unknown 30564 1726882860.73533: variable 'ansible_search_path' from source: unknown 30564 1726882860.73883: we have included files to process 30564 1726882860.73884: generating all_blocks data 30564 1726882860.73886: done generating all_blocks data 30564 1726882860.73892: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30564 1726882860.73893: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30564 1726882860.73895: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30564 1726882860.74280: done processing included file 30564 1726882860.74282: iterating over new_blocks loaded from include file 30564 1726882860.74284: in VariableManager get_vars() 30564 1726882860.74301: done with get_vars() 30564 1726882860.74302: filtering new block on tags 30564 1726882860.74331: done filtering new block on tags 30564 1726882860.74333: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node2 => (item=tasks/remove_profile.yml) 30564 1726882860.74338: extending task lists for all hosts with included blocks 30564 1726882860.75759: done extending task lists 30564 
1726882860.75761: done processing included files 30564 1726882860.75762: results queue empty 30564 1726882860.75762: checking for any_errors_fatal 30564 1726882860.76767: done checking for any_errors_fatal 30564 1726882860.76770: checking for max_fail_percentage 30564 1726882860.76772: done checking for max_fail_percentage 30564 1726882860.76773: checking to see if all hosts have failed and the running result is not ok 30564 1726882860.76774: done checking to see if all hosts have failed 30564 1726882860.76775: getting the remaining hosts for this loop 30564 1726882860.76776: done getting the remaining hosts for this loop 30564 1726882860.76779: getting the next task for host managed_node2 30564 1726882860.76784: done getting next task for host managed_node2 30564 1726882860.76786: ^ task is: TASK: Include network role 30564 1726882860.76789: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882860.76792: getting variables 30564 1726882860.76793: in VariableManager get_vars() 30564 1726882860.76804: Calling all_inventory to load vars for managed_node2 30564 1726882860.76806: Calling groups_inventory to load vars for managed_node2 30564 1726882860.76808: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882860.76813: Calling all_plugins_play to load vars for managed_node2 30564 1726882860.76816: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882860.76818: Calling groups_plugins_play to load vars for managed_node2 30564 1726882860.80596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882860.83241: done with get_vars() 30564 1726882860.83272: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 21:41:00 -0400 (0:00:00.225) 0:00:59.414 ****** 30564 1726882860.83360: entering _queue_task() for managed_node2/include_role 30564 1726882860.83705: worker is 1 (out of 1 available) 30564 1726882860.83717: exiting _queue_task() for managed_node2/include_role 30564 1726882860.83730: done queuing things up, now waiting for results queue to drain 30564 1726882860.83731: waiting for pending results... 
30564 1726882860.84033: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882860.84148: in run() - task 0e448fcc-3ce9-4216-acec-0000000013e8 30564 1726882860.84162: variable 'ansible_search_path' from source: unknown 30564 1726882860.84173: variable 'ansible_search_path' from source: unknown 30564 1726882860.84207: calling self._execute() 30564 1726882860.84313: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882860.84320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882860.84330: variable 'omit' from source: magic vars 30564 1726882860.84720: variable 'ansible_distribution_major_version' from source: facts 30564 1726882860.84732: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882860.84738: _execute() done 30564 1726882860.84741: dumping result to json 30564 1726882860.84744: done dumping result, returning 30564 1726882860.84750: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-0000000013e8] 30564 1726882860.84755: sending task result for task 0e448fcc-3ce9-4216-acec-0000000013e8 30564 1726882860.84866: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000013e8 30564 1726882860.84870: WORKER PROCESS EXITING 30564 1726882860.84899: no more pending results, returning what we have 30564 1726882860.84905: in VariableManager get_vars() 30564 1726882860.84945: Calling all_inventory to load vars for managed_node2 30564 1726882860.84948: Calling groups_inventory to load vars for managed_node2 30564 1726882860.84952: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882860.84971: Calling all_plugins_play to load vars for managed_node2 30564 1726882860.84975: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882860.84978: Calling groups_plugins_play to load vars for managed_node2 30564 1726882860.86797: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882860.88598: done with get_vars() 30564 1726882860.88618: variable 'ansible_search_path' from source: unknown 30564 1726882860.88619: variable 'ansible_search_path' from source: unknown 30564 1726882860.88771: variable 'omit' from source: magic vars 30564 1726882860.88818: variable 'omit' from source: magic vars 30564 1726882860.88833: variable 'omit' from source: magic vars 30564 1726882860.88836: we have included files to process 30564 1726882860.88837: generating all_blocks data 30564 1726882860.88839: done generating all_blocks data 30564 1726882860.88841: processing included file: fedora.linux_system_roles.network 30564 1726882860.88861: in VariableManager get_vars() 30564 1726882860.88880: done with get_vars() 30564 1726882860.88912: in VariableManager get_vars() 30564 1726882860.88929: done with get_vars() 30564 1726882860.88970: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882860.89109: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882860.89198: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882860.89682: in VariableManager get_vars() 30564 1726882860.89703: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882860.93472: iterating over new_blocks loaded from include file 30564 1726882860.93474: in VariableManager get_vars() 30564 1726882860.93493: done with get_vars() 30564 1726882860.93495: filtering new block on tags 30564 1726882860.94331: done filtering new block on tags 30564 1726882860.94334: in VariableManager get_vars() 30564 1726882860.94350: done with get_vars() 30564 1726882860.94351: filtering new block on tags 30564 1726882860.94373: done 
filtering new block on tags 30564 1726882860.94376: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882860.94382: extending task lists for all hosts with included blocks 30564 1726882860.94676: done extending task lists 30564 1726882860.94678: done processing included files 30564 1726882860.94679: results queue empty 30564 1726882860.94680: checking for any_errors_fatal 30564 1726882860.94684: done checking for any_errors_fatal 30564 1726882860.94684: checking for max_fail_percentage 30564 1726882860.94685: done checking for max_fail_percentage 30564 1726882860.94686: checking to see if all hosts have failed and the running result is not ok 30564 1726882860.94687: done checking to see if all hosts have failed 30564 1726882860.94688: getting the remaining hosts for this loop 30564 1726882860.94689: done getting the remaining hosts for this loop 30564 1726882860.94692: getting the next task for host managed_node2 30564 1726882860.94697: done getting next task for host managed_node2 30564 1726882860.94699: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882860.94703: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882860.94713: getting variables 30564 1726882860.94714: in VariableManager get_vars() 30564 1726882860.94727: Calling all_inventory to load vars for managed_node2 30564 1726882860.94729: Calling groups_inventory to load vars for managed_node2 30564 1726882860.94731: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882860.94736: Calling all_plugins_play to load vars for managed_node2 30564 1726882860.94738: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882860.94741: Calling groups_plugins_play to load vars for managed_node2 30564 1726882860.97236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882860.99852: done with get_vars() 30564 1726882860.99878: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:00 -0400 (0:00:00.165) 0:00:59.580 ****** 30564 1726882860.99957: entering _queue_task() for managed_node2/include_tasks 30564 1726882861.00454: worker is 1 (out of 1 available) 30564 1726882861.00472: exiting _queue_task() for managed_node2/include_tasks 30564 1726882861.00485: done queuing things up, now waiting for results queue to drain 30564 1726882861.00487: waiting for pending results... 
30564 1726882861.01784: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882861.02120: in run() - task 0e448fcc-3ce9-4216-acec-00000000145f 30564 1726882861.02259: variable 'ansible_search_path' from source: unknown 30564 1726882861.02273: variable 'ansible_search_path' from source: unknown 30564 1726882861.02314: calling self._execute() 30564 1726882861.02533: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882861.02545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882861.02559: variable 'omit' from source: magic vars 30564 1726882861.03390: variable 'ansible_distribution_major_version' from source: facts 30564 1726882861.03467: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882861.03481: _execute() done 30564 1726882861.03556: dumping result to json 30564 1726882861.03570: done dumping result, returning 30564 1726882861.03582: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-00000000145f] 30564 1726882861.03594: sending task result for task 0e448fcc-3ce9-4216-acec-00000000145f 30564 1726882861.03709: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000145f 30564 1726882861.03718: WORKER PROCESS EXITING 30564 1726882861.03911: no more pending results, returning what we have 30564 1726882861.03916: in VariableManager get_vars() 30564 1726882861.03966: Calling all_inventory to load vars for managed_node2 30564 1726882861.03970: Calling groups_inventory to load vars for managed_node2 30564 1726882861.03973: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882861.03987: Calling all_plugins_play to load vars for managed_node2 30564 1726882861.03990: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882861.03992: Calling 
groups_plugins_play to load vars for managed_node2 30564 1726882861.06784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882861.09688: done with get_vars() 30564 1726882861.09715: variable 'ansible_search_path' from source: unknown 30564 1726882861.09717: variable 'ansible_search_path' from source: unknown 30564 1726882861.09758: we have included files to process 30564 1726882861.09759: generating all_blocks data 30564 1726882861.09767: done generating all_blocks data 30564 1726882861.09774: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882861.09776: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882861.09779: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882861.10556: done processing included file 30564 1726882861.10558: iterating over new_blocks loaded from include file 30564 1726882861.10560: in VariableManager get_vars() 30564 1726882861.11387: done with get_vars() 30564 1726882861.11389: filtering new block on tags 30564 1726882861.11422: done filtering new block on tags 30564 1726882861.11428: in VariableManager get_vars() 30564 1726882861.11452: done with get_vars() 30564 1726882861.11454: filtering new block on tags 30564 1726882861.11512: done filtering new block on tags 30564 1726882861.11515: in VariableManager get_vars() 30564 1726882861.11537: done with get_vars() 30564 1726882861.11539: filtering new block on tags 30564 1726882861.11582: done filtering new block on tags 30564 1726882861.11584: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882861.11590: extending task lists for 
all hosts with included blocks 30564 1726882861.14729: done extending task lists 30564 1726882861.14730: done processing included files 30564 1726882861.14731: results queue empty 30564 1726882861.14732: checking for any_errors_fatal 30564 1726882861.14736: done checking for any_errors_fatal 30564 1726882861.14736: checking for max_fail_percentage 30564 1726882861.14738: done checking for max_fail_percentage 30564 1726882861.14739: checking to see if all hosts have failed and the running result is not ok 30564 1726882861.14740: done checking to see if all hosts have failed 30564 1726882861.14741: getting the remaining hosts for this loop 30564 1726882861.14742: done getting the remaining hosts for this loop 30564 1726882861.14745: getting the next task for host managed_node2 30564 1726882861.14751: done getting next task for host managed_node2 30564 1726882861.14754: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882861.14758: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882861.14771: getting variables 30564 1726882861.14773: in VariableManager get_vars() 30564 1726882861.14787: Calling all_inventory to load vars for managed_node2 30564 1726882861.14790: Calling groups_inventory to load vars for managed_node2 30564 1726882861.14792: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882861.14797: Calling all_plugins_play to load vars for managed_node2 30564 1726882861.14799: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882861.14802: Calling groups_plugins_play to load vars for managed_node2 30564 1726882861.16458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882861.19226: done with get_vars() 30564 1726882861.19257: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:01 -0400 (0:00:00.193) 0:00:59.774 ****** 30564 1726882861.19349: entering _queue_task() for managed_node2/setup 30564 1726882861.19720: worker is 1 (out of 1 available) 30564 1726882861.19735: exiting _queue_task() for managed_node2/setup 30564 1726882861.19748: done queuing things up, now waiting for results queue to drain 30564 1726882861.19750: waiting for pending results... 
30564 1726882861.20074: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882861.20222: in run() - task 0e448fcc-3ce9-4216-acec-0000000014b6 30564 1726882861.20236: variable 'ansible_search_path' from source: unknown 30564 1726882861.20240: variable 'ansible_search_path' from source: unknown 30564 1726882861.20274: calling self._execute() 30564 1726882861.20381: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882861.20385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882861.20395: variable 'omit' from source: magic vars 30564 1726882861.20802: variable 'ansible_distribution_major_version' from source: facts 30564 1726882861.20815: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882861.21055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882861.23537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882861.23621: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882861.23659: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882861.23705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882861.23734: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882861.23828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882861.23857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882861.23888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882861.23934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882861.23949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882861.24009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882861.24034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882861.24056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882861.24102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882861.24123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882861.24287: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882861.24296: variable 'ansible_facts' from source: unknown 30564 1726882861.25170: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882861.25174: when evaluation is False, skipping this task 30564 1726882861.25177: _execute() done 30564 1726882861.25179: dumping result to json 30564 1726882861.25182: done dumping result, returning 30564 1726882861.25184: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-0000000014b6] 30564 1726882861.25197: sending task result for task 0e448fcc-3ce9-4216-acec-0000000014b6 30564 1726882861.25296: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000014b6 30564 1726882861.25299: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882861.25350: no more pending results, returning what we have 30564 1726882861.25355: results queue empty 30564 1726882861.25356: checking for any_errors_fatal 30564 1726882861.25358: done checking for any_errors_fatal 30564 1726882861.25358: checking for max_fail_percentage 30564 1726882861.25360: done checking for max_fail_percentage 30564 1726882861.25362: checking to see if all hosts have failed and the running result is not ok 30564 1726882861.25362: done checking to see if all hosts have failed 30564 1726882861.25365: getting the remaining hosts for this loop 30564 1726882861.25367: done getting the remaining hosts for this loop 30564 1726882861.25371: getting the next task for host managed_node2 30564 1726882861.25384: done getting next task for host managed_node2 30564 1726882861.25388: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882861.25394: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882861.25426: getting variables 30564 1726882861.25428: in VariableManager get_vars() 30564 1726882861.25469: Calling all_inventory to load vars for managed_node2 30564 1726882861.25473: Calling groups_inventory to load vars for managed_node2 30564 1726882861.25475: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882861.25487: Calling all_plugins_play to load vars for managed_node2 30564 1726882861.25491: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882861.25501: Calling groups_plugins_play to load vars for managed_node2 30564 1726882861.27422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882861.35493: done with get_vars() 30564 1726882861.35516: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:01 -0400 (0:00:00.162) 0:00:59.937 ****** 30564 1726882861.35609: entering _queue_task() for managed_node2/stat 30564 1726882861.35948: worker is 1 (out of 1 available) 30564 1726882861.35961: exiting _queue_task() for managed_node2/stat 30564 1726882861.35976: done queuing things up, now waiting for results queue to drain 30564 1726882861.35977: waiting for pending results... 
30564 1726882861.36285: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882861.36451: in run() - task 0e448fcc-3ce9-4216-acec-0000000014b8 30564 1726882861.36467: variable 'ansible_search_path' from source: unknown 30564 1726882861.36476: variable 'ansible_search_path' from source: unknown 30564 1726882861.36504: calling self._execute() 30564 1726882861.36611: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882861.36616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882861.36627: variable 'omit' from source: magic vars 30564 1726882861.37583: variable 'ansible_distribution_major_version' from source: facts 30564 1726882861.37596: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882861.37774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882861.38062: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882861.38110: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882861.38174: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882861.38209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882861.38298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882861.38321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882861.38345: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882861.38374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882861.38466: variable '__network_is_ostree' from source: set_fact 30564 1726882861.38472: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882861.38476: when evaluation is False, skipping this task 30564 1726882861.38478: _execute() done 30564 1726882861.38481: dumping result to json 30564 1726882861.38483: done dumping result, returning 30564 1726882861.38496: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-0000000014b8] 30564 1726882861.38509: sending task result for task 0e448fcc-3ce9-4216-acec-0000000014b8 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882861.38653: no more pending results, returning what we have 30564 1726882861.38657: results queue empty 30564 1726882861.38658: checking for any_errors_fatal 30564 1726882861.38668: done checking for any_errors_fatal 30564 1726882861.38669: checking for max_fail_percentage 30564 1726882861.38671: done checking for max_fail_percentage 30564 1726882861.38672: checking to see if all hosts have failed and the running result is not ok 30564 1726882861.38673: done checking to see if all hosts have failed 30564 1726882861.38674: getting the remaining hosts for this loop 30564 1726882861.38676: done getting the remaining hosts for this loop 30564 1726882861.38681: getting the next task for host managed_node2 30564 1726882861.38689: done getting next task for host managed_node2 30564 
1726882861.38693: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882861.38698: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30564 1726882861.38728: getting variables
30564 1726882861.38730: in VariableManager get_vars()
30564 1726882861.38776: Calling all_inventory to load vars for managed_node2
30564 1726882861.38779: Calling groups_inventory to load vars for managed_node2
30564 1726882861.38782: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882861.38793: Calling all_plugins_play to load vars for managed_node2
30564 1726882861.38796: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882861.38799: Calling groups_plugins_play to load vars for managed_node2
30564 1726882861.39428: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000014b8
30564 1726882861.39431: WORKER PROCESS EXITING
30564 1726882861.41211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882861.45091: done with get_vars()
30564 1726882861.45124: done getting variables
30564 1726882861.45298: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:41:01 -0400 (0:00:00.097) 0:01:00.034 ******
30564 1726882861.45335: entering _queue_task() for managed_node2/set_fact
30564 1726882861.45789: worker is 1 (out of 1 available)
30564 1726882861.45804: exiting _queue_task() for managed_node2/set_fact
30564 1726882861.45823: done queuing things up, now waiting for results queue to drain
30564 1726882861.45825: waiting for pending results...
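Both ostree tasks in this stretch of the log are skipped because `__network_is_ostree` was already set by an earlier `set_fact`. A minimal sketch of the guard pattern these tasks follow (the task names match the log; the exact bodies in set_facts.yml are assumed here, not quoted from this run):

```yaml
# Sketch only: probe for ostree once, then cache the answer in a fact.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumption: the ostree-booted marker is what the probe checks
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Once the fact exists, both `when` conditions evaluate to False, which is exactly the `false_condition` reported in the skip results in this log.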
30564 1726882861.46714: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882861.48712: in run() - task 0e448fcc-3ce9-4216-acec-0000000014b9 30564 1726882861.48737: variable 'ansible_search_path' from source: unknown 30564 1726882861.48747: variable 'ansible_search_path' from source: unknown 30564 1726882861.48794: calling self._execute() 30564 1726882861.48909: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882861.48922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882861.48936: variable 'omit' from source: magic vars 30564 1726882861.49330: variable 'ansible_distribution_major_version' from source: facts 30564 1726882861.50286: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882861.50448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882861.50724: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882861.51712: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882861.51775: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882861.51811: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882861.51911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882861.51942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882861.51981: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882861.52242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882861.52405: variable '__network_is_ostree' from source: set_fact
30564 1726882861.52419: Evaluated conditional (not __network_is_ostree is defined): False
30564 1726882861.52428: when evaluation is False, skipping this task
30564 1726882861.52436: _execute() done
30564 1726882861.52442: dumping result to json
30564 1726882861.52509: done dumping result, returning
30564 1726882861.52522: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-0000000014b9]
30564 1726882861.52534: sending task result for task 0e448fcc-3ce9-4216-acec-0000000014b9
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882861.52695: no more pending results, returning what we have
30564 1726882861.52699: results queue empty
30564 1726882861.52701: checking for any_errors_fatal
30564 1726882861.52707: done checking for any_errors_fatal
30564 1726882861.52708: checking for max_fail_percentage
30564 1726882861.52709: done checking for max_fail_percentage
30564 1726882861.52710: checking to see if all hosts have failed and the running result is not ok
30564 1726882861.52711: done checking to see if all hosts have failed
30564 1726882861.52712: getting the remaining hosts for this loop
30564 1726882861.52714: done getting the remaining hosts for this loop
30564 1726882861.52718: getting the next task for host managed_node2
30564 1726882861.52730: done getting next task for host managed_node2
30564
1726882861.52735: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882861.52740: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30564 1726882861.52765: getting variables
30564 1726882861.52767: in VariableManager get_vars()
30564 1726882861.52809: Calling all_inventory to load vars for managed_node2
30564 1726882861.52812: Calling groups_inventory to load vars for managed_node2
30564 1726882861.52815: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882861.52826: Calling all_plugins_play to load vars for managed_node2
30564 1726882861.52830: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882861.52834: Calling groups_plugins_play to load vars for managed_node2
30564 1726882861.53413: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000014b9
30564 1726882861.53417: WORKER PROCESS EXITING
30564 1726882861.56592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882861.60165: done with get_vars()
30564 1726882861.60192: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:41:01 -0400 (0:00:00.149) 0:01:00.184 ******
30564 1726882861.60303: entering _queue_task() for managed_node2/service_facts
30564 1726882861.60769: worker is 1 (out of 1 available)
30564 1726882861.60787: exiting _queue_task() for managed_node2/service_facts
30564 1726882861.60815: done queuing things up, now waiting for results queue to drain
30564 1726882861.60817: waiting for pending results...
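The task queued here runs the `service_facts` module, whose output lands under `ansible_facts.services`. A minimal sketch of the task plus a hypothetical consumer (the real set_facts.yml may differ; the `debug` follow-up is illustrative only):

```yaml
- name: Check which services are running
  service_facts:

# Hypothetical follow-up showing how the collected facts are typically consumed:
- name: React to a running service
  debug:
    msg: NetworkManager is active
  when: ansible_facts.services['NetworkManager.service'].state | default('') == 'running'
```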
30564 1726882861.61996: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882861.62532: in run() - task 0e448fcc-3ce9-4216-acec-0000000014bb 30564 1726882861.62536: variable 'ansible_search_path' from source: unknown 30564 1726882861.62539: variable 'ansible_search_path' from source: unknown 30564 1726882861.62567: calling self._execute() 30564 1726882861.62785: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882861.62907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882861.62917: variable 'omit' from source: magic vars 30564 1726882861.63895: variable 'ansible_distribution_major_version' from source: facts 30564 1726882861.63927: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882861.63933: variable 'omit' from source: magic vars 30564 1726882861.64113: variable 'omit' from source: magic vars 30564 1726882861.64171: variable 'omit' from source: magic vars 30564 1726882861.64223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882861.64266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882861.64286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882861.64304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882861.64320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882861.64391: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882861.64395: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882861.64399: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882861.64512: Set connection var ansible_timeout to 10 30564 1726882861.64517: Set connection var ansible_pipelining to False 30564 1726882861.64520: Set connection var ansible_shell_type to sh 30564 1726882861.64525: Set connection var ansible_shell_executable to /bin/sh 30564 1726882861.64537: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882861.64540: Set connection var ansible_connection to ssh 30564 1726882861.64566: variable 'ansible_shell_executable' from source: unknown 30564 1726882861.64572: variable 'ansible_connection' from source: unknown 30564 1726882861.64575: variable 'ansible_module_compression' from source: unknown 30564 1726882861.64578: variable 'ansible_shell_type' from source: unknown 30564 1726882861.64580: variable 'ansible_shell_executable' from source: unknown 30564 1726882861.64582: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882861.64587: variable 'ansible_pipelining' from source: unknown 30564 1726882861.64595: variable 'ansible_timeout' from source: unknown 30564 1726882861.64599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882861.64827: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882861.64835: variable 'omit' from source: magic vars 30564 1726882861.64840: starting attempt loop 30564 1726882861.64842: running the handler 30564 1726882861.64861: _low_level_execute_command(): starting 30564 1726882861.64872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882861.65633: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882861.65651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882861.65662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.65682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.65724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882861.65731: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882861.65741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.65760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882861.65772: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882861.65775: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882861.65789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882861.65800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.65817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.65824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882861.65831: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882861.65841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.65924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882861.65946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882861.65959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882861.66172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
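The "Set connection var" records earlier in this stretch show the effective per-host connection settings for this run (ssh connection, `sh` shell, 10-second timeout, pipelining off); where not set in inventory or play vars they resolve from defaults. A hypothetical inventory snippet that would pin the same values explicitly (illustrative only; this run resolved most of them from defaults):

```yaml
all:
  hosts:
    managed_node2:
      ansible_host: 10.31.11.158
      ansible_connection: ssh
      ansible_timeout: 10
      ansible_pipelining: false   # with pipelining on, the temp-dir/sftp/chmod round trips are skipped
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
```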
30564 1726882861.67813: stdout chunk (state=3): >>>/root <<< 30564 1726882861.67929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882861.68017: stderr chunk (state=3): >>><<< 30564 1726882861.68020: stdout chunk (state=3): >>><<< 30564 1726882861.68058: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882861.68073: _low_level_execute_command(): starting 30564 1726882861.68079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580 `" && echo ansible-tmp-1726882861.6805627-33141-253301815922580="` echo /root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580 `" ) && sleep 0' 30564 1726882861.69299: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882861.69307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882861.69318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.69330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.69379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882861.69464: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882861.69475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.69489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882861.69496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882861.69502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882861.69509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882861.69518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.69528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.69535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882861.69543: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882861.69549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.69734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882861.69749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882861.69761: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882861.69904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882861.71847: stdout chunk (state=3): >>>ansible-tmp-1726882861.6805627-33141-253301815922580=/root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580 <<< 30564 1726882861.72043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882861.72047: stdout chunk (state=3): >>><<< 30564 1726882861.72049: stderr chunk (state=3): >>><<< 30564 1726882861.72052: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882861.6805627-33141-253301815922580=/root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882861.72094: variable 'ansible_module_compression' from source: unknown 30564 1726882861.72141: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882861.72177: variable 'ansible_facts' from source: unknown 30564 1726882861.72263: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580/AnsiballZ_service_facts.py 30564 1726882861.72433: Sending initial data 30564 1726882861.72436: Sent initial data (162 bytes) 30564 1726882861.73516: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882861.73525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882861.73538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.73562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.73592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882861.73595: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882861.73608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.73628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.73634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.73709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882861.73713: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882861.73716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882861.73826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882861.76238: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882861.76887: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882861.77243: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpq4mkru2i /root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580/AnsiballZ_service_facts.py <<< 30564 1726882861.77371: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882861.78613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882861.78715: stderr chunk (state=3): >>><<< 30564 1726882861.78718: stdout chunk (state=3): >>><<< 30564 1726882861.78749: done transferring module to remote 30564 1726882861.78752: _low_level_execute_command(): starting 30564 1726882861.78755: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580/ 
/root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580/AnsiballZ_service_facts.py && sleep 0' 30564 1726882861.79206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.79213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.79246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.79250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.79297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882861.79308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882861.79417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882861.81347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882861.81350: stdout chunk (state=3): >>><<< 30564 1726882861.81353: stderr chunk (state=3): >>><<< 30564 1726882861.81438: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882861.81443: _low_level_execute_command(): starting 30564 1726882861.81446: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580/AnsiballZ_service_facts.py && sleep 0' 30564 1726882861.82504: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882861.82516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882861.82530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.82546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.82589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882861.82600: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882861.82612: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.82627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882861.82638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882861.82649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882861.82660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882861.82677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882861.82695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882861.82708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882861.82720: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882861.82733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882861.82807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882861.82824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882861.82839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882861.82985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882863.20005: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 30564 1726882863.20017: stdout chunk 
(state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": 
"dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": 
{"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882863.21285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882863.21289: stdout chunk (state=3): >>><<< 30564 1726882863.21295: stderr chunk (state=3): >>><<< 30564 1726882863.21328: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882863.22034: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882863.22041: _low_level_execute_command(): starting 30564 1726882863.22047: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882861.6805627-33141-253301815922580/ > /dev/null 2>&1 && sleep 0' 30564 1726882863.22981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882863.22985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882863.22995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.23009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.23045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.23053: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882863.23065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.23081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882863.23088: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882863.23095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882863.23103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882863.23112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.23124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.23131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.23138: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882863.23148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.23233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882863.23255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882863.23278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882863.23480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882863.25297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882863.25301: stdout chunk (state=3): >>><<< 30564 1726882863.25307: stderr chunk (state=3): >>><<< 30564 1726882863.25320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882863.25326: handler run complete 30564 1726882863.25502: variable 'ansible_facts' from source: unknown 30564 1726882863.25647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882863.26096: variable 'ansible_facts' from source: unknown 30564 1726882863.26218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882863.26897: attempt loop complete, returning result 30564 1726882863.26902: _execute() done 30564 1726882863.26905: dumping result to json 30564 1726882863.26965: done dumping result, returning 30564 1726882863.27193: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-0000000014bb] 30564 1726882863.27198: sending task result for task 0e448fcc-3ce9-4216-acec-0000000014bb ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882863.28662: no more pending results, returning what we have 30564 1726882863.28666: results queue empty 30564 1726882863.28667: checking for any_errors_fatal 30564 1726882863.28675: done checking for any_errors_fatal 30564 1726882863.28676: checking for max_fail_percentage 30564 
1726882863.28678: done checking for max_fail_percentage 30564 1726882863.28679: checking to see if all hosts have failed and the running result is not ok 30564 1726882863.28680: done checking to see if all hosts have failed 30564 1726882863.28681: getting the remaining hosts for this loop 30564 1726882863.28682: done getting the remaining hosts for this loop 30564 1726882863.28686: getting the next task for host managed_node2 30564 1726882863.28692: done getting next task for host managed_node2 30564 1726882863.28695: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882863.28700: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882863.28712: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000014bb 30564 1726882863.28722: WORKER PROCESS EXITING 30564 1726882863.28727: getting variables 30564 1726882863.28729: in VariableManager get_vars() 30564 1726882863.28761: Calling all_inventory to load vars for managed_node2 30564 1726882863.28765: Calling groups_inventory to load vars for managed_node2 30564 1726882863.28770: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882863.28781: Calling all_plugins_play to load vars for managed_node2 30564 1726882863.28783: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882863.28786: Calling groups_plugins_play to load vars for managed_node2 30564 1726882863.31684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882863.35904: done with get_vars() 30564 1726882863.35948: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:03 -0400 (0:00:01.758) 0:01:01.942 ****** 30564 1726882863.36122: entering _queue_task() for managed_node2/package_facts 30564 1726882863.36687: worker is 1 (out of 1 available) 30564 1726882863.36698: exiting _queue_task() for managed_node2/package_facts 30564 1726882863.36711: done queuing things up, now waiting for results queue to drain 30564 1726882863.36712: waiting for pending results... 
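The task header above carries two durations from the profiling output: the previous task's elapsed time (`0:00:01.758`) and the cumulative playbook time (`0:01:01.942`). A hedged sketch of parsing that timing pair — the regex assumes the fixed `H:MM:SS.mmm` layout seen in this log, and is an illustrative parser, not part of Ansible:

```python
import re

# Matches the "(0:00:01.758)       0:01:01.942" pair printed after a task
# header. Layout inferred from the log above, not a documented format.
TIMING = re.compile(r"\((?P<task>\d+:\d{2}:\d{2}\.\d{3})\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d{3})")

def to_seconds(stamp):
    """Convert an H:MM:SS.mmm stamp to float seconds."""
    hours, minutes, seconds = stamp.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

def parse_timing(line):
    """Return (task_seconds, cumulative_seconds) from a task header line."""
    m = TIMING.search(line)
    if not m:
        raise ValueError(f"no timing pair in: {line!r}")
    return to_seconds(m.group("task")), to_seconds(m.group("total"))

# The timing line copied from the log above:
line = "Friday 20 September 2024  21:41:03 -0400 (0:00:01.758)       0:01:01.942 ******"
```

Summing the per-task values across a run is a quick way to find which role tasks (here, the fact-gathering steps) dominate wall-clock time.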
30564 1726882863.37423: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882863.37533: in run() - task 0e448fcc-3ce9-4216-acec-0000000014bc 30564 1726882863.37544: variable 'ansible_search_path' from source: unknown 30564 1726882863.37548: variable 'ansible_search_path' from source: unknown 30564 1726882863.37585: calling self._execute() 30564 1726882863.37670: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882863.37676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882863.37686: variable 'omit' from source: magic vars 30564 1726882863.37991: variable 'ansible_distribution_major_version' from source: facts 30564 1726882863.38000: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882863.38005: variable 'omit' from source: magic vars 30564 1726882863.38056: variable 'omit' from source: magic vars 30564 1726882863.38081: variable 'omit' from source: magic vars 30564 1726882863.38114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882863.38147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882863.38161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882863.38178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882863.38188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882863.38212: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882863.38216: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882863.38219: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882863.38291: Set connection var ansible_timeout to 10 30564 1726882863.38295: Set connection var ansible_pipelining to False 30564 1726882863.38298: Set connection var ansible_shell_type to sh 30564 1726882863.38302: Set connection var ansible_shell_executable to /bin/sh 30564 1726882863.38317: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882863.38336: Set connection var ansible_connection to ssh 30564 1726882863.38534: variable 'ansible_shell_executable' from source: unknown 30564 1726882863.38538: variable 'ansible_connection' from source: unknown 30564 1726882863.38541: variable 'ansible_module_compression' from source: unknown 30564 1726882863.38543: variable 'ansible_shell_type' from source: unknown 30564 1726882863.38547: variable 'ansible_shell_executable' from source: unknown 30564 1726882863.38549: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882863.38552: variable 'ansible_pipelining' from source: unknown 30564 1726882863.38554: variable 'ansible_timeout' from source: unknown 30564 1726882863.38556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882863.38606: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882863.38621: variable 'omit' from source: magic vars 30564 1726882863.38624: starting attempt loop 30564 1726882863.38627: running the handler 30564 1726882863.38642: _low_level_execute_command(): starting 30564 1726882863.38645: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882863.39354: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882863.39377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882863.39392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.39411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.39457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.39466: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882863.39475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.39489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882863.39497: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882863.39504: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882863.39512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882863.39524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.39801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.39805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882863.39956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882863.41569: stdout chunk (state=3): >>>/root <<< 30564 1726882863.41677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882863.41719: stderr chunk (state=3): >>><<< 30564 1726882863.41724: stdout chunk (state=3): >>><<< 30564 
1726882863.41737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882863.41750: _low_level_execute_command(): starting 30564 1726882863.41754: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678 `" && echo ansible-tmp-1726882863.4173615-33230-87384571061678="` echo /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678 `" ) && sleep 0' 30564 1726882863.42163: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882863.42175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.42186: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.42211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.42217: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882863.42226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.42235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882863.42242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.42251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.42258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.42310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882863.42320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882863.42333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882863.42444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882863.44329: stdout chunk (state=3): >>>ansible-tmp-1726882863.4173615-33230-87384571061678=/root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678 <<< 30564 1726882863.44444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882863.44509: stderr chunk (state=3): >>><<< 30564 1726882863.44515: stdout chunk (state=3): >>><<< 30564 1726882863.44532: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882863.4173615-33230-87384571061678=/root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882863.44578: variable 'ansible_module_compression' from source: unknown 30564 1726882863.44625: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882863.44680: variable 'ansible_facts' from source: unknown 30564 1726882863.44860: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678/AnsiballZ_package_facts.py 30564 1726882863.45007: Sending initial data 30564 1726882863.45011: Sent initial data (161 bytes) 30564 1726882863.45958: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882863.45970: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882863.45979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.45993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.46029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.46035: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882863.46051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.46066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882863.46073: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882863.46080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882863.46087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882863.46096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.46107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.46114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.46120: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882863.46130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.46210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882863.46226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882863.46238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882863.46361: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882863.48126: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882863.48217: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882863.48319: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmph5lty4wt /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678/AnsiballZ_package_facts.py <<< 30564 1726882863.48407: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882863.51083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882863.51184: stderr chunk (state=3): >>><<< 30564 1726882863.51188: stdout chunk (state=3): >>><<< 30564 1726882863.51207: done transferring module to remote 30564 1726882863.51219: _low_level_execute_command(): starting 30564 1726882863.51223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678/ /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678/AnsiballZ_package_facts.py && sleep 0' 30564 1726882863.51893: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882863.51900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30564 1726882863.51911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.51925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.51968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.51980: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882863.51990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.52003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882863.52011: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882863.52018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882863.52025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882863.52034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.52045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.52052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.52065: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882863.52077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.52148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882863.52161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882863.52183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882863.52308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882863.54092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882863.54166: stderr chunk (state=3): >>><<< 30564 1726882863.54178: stdout chunk (state=3): >>><<< 30564 1726882863.54276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882863.54280: _low_level_execute_command(): starting 30564 1726882863.54283: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678/AnsiballZ_package_facts.py && sleep 0' 30564 1726882863.54871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882863.54886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882863.54901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882863.54919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.54974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.54988: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882863.55002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.55020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882863.55037: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882863.55055: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882863.55069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882863.55084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882863.55099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882863.55110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882863.55120: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882863.55132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882863.55218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882863.55238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882863.55259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882863.55415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882864.01811: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", 
"version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"<<< 30564 
1726882864.01821: stdout chunk (state=3): >>>}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_<<< 30564 1726882864.01835: stdout chunk (state=3): >>>64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", 
"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x8<<< 30564 1726882864.01840: stdout chunk (state=3): >>>6_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": 
"8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [<<< 30564 1726882864.01847: stdout chunk (state=3): >>>{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": 
"2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba<<< 30564 1726882864.01850: stdout chunk (state=3): >>>", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", 
"release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "py<<< 30564 1726882864.01856: stdout chunk (state=3): >>>thon3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epo<<< 30564 1726882864.01859: stdout chunk (state=3): >>>ch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": 
[{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", 
"release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": 
"0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.<<< 30564 1726882864.01945: stdout chunk (state=3): >>>9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", 
"version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": 
[{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": 
"subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": 
"18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", 
"release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "pe<<< 30564 1726882864.01956: stdout chunk (state=3): >>>rl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": 
[{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": 
[{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": 
"1.2.6", "release": "7.el9", "epoch"<<< 30564 1726882864.01970: stdout chunk (state=3): >>>: null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "relea<<< 30564 1726882864.01988: stdout chunk (state=3): >>>se": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882864.03582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882864.03626: stderr chunk (state=3): >>><<< 30564 1726882864.03629: stdout chunk (state=3): >>><<< 30564 1726882864.03734: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": 
"6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": 
"26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": 
[{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": 
"3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", 
"version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": 
[{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", 
"version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": 
"1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", 
"version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": 
"1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", 
"source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": 
"perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
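The module output above follows the `package_facts` data shape: a mapping from package name to a *list* of install records, each with `version`, `release`, `epoch`, `arch`, and `source` keys (note `gpg-pubkey` maps to two records, and `epoch`/`arch` can be null). As an illustrative sketch, not part of this log, a few lines of Python show how such a structure can be queried; the excerpt below is modeled on entries visible in the dump:

```python
# Illustrative sketch (assumption: not taken verbatim from this run) of the
# ansible_facts.packages structure returned by the package_facts module above:
# package name -> list of install records.

# Tiny excerpt modeled on the log output.
packages = {
    "firewalld": [
        {"name": "firewalld", "version": "1.3.4", "release": "7.el9",
         "epoch": None, "arch": "noarch", "source": "rpm"},
    ],
    "gpg-pubkey": [
        {"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb",
         "epoch": None, "arch": None, "source": "rpm"},
        {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19",
         "epoch": None, "arch": None, "source": "rpm"},
    ],
}

def nevra(pkg):
    """Format one install record as an RPM-style NEVRA string."""
    epoch = f"{pkg['epoch']}:" if pkg["epoch"] else ""   # epoch may be None
    arch = f".{pkg['arch']}" if pkg["arch"] else ""       # arch may be None
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}{arch}"

for name, installs in packages.items():
    for pkg in installs:  # one name can map to several installed records
        print(nevra(pkg))
```

In a playbook the same lookup is typically done as `ansible_facts.packages['firewalld'][0].version` after running `package_facts` with `manager: auto`, as the invocation at the end of the dump shows.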
30564 1726882864.09086: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882864.09117: _low_level_execute_command(): starting 30564 1726882864.09121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882863.4173615-33230-87384571061678/ > /dev/null 2>&1 && sleep 0' 30564 1726882864.10526: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882864.10544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882864.10559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882864.10583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882864.10624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882864.10637: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882864.10651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882864.10675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882864.10689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882864.10700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882864.10711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882864.10724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882864.10741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882864.10753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882864.10767: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882864.10785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882864.10860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882864.10881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882864.10897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882864.11256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882864.13161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882864.13167: stdout chunk (state=3): >>><<< 30564 1726882864.13181: stderr chunk (state=3): >>><<< 30564 1726882864.13195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882864.13202: handler run complete 30564 1726882864.14322: variable 'ansible_facts' from source: unknown 30564 1726882864.15102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.18037: variable 'ansible_facts' from source: unknown 30564 1726882864.18722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.19611: attempt loop complete, returning result 30564 1726882864.19629: _execute() done 30564 1726882864.19636: dumping result to json 30564 1726882864.19903: done dumping result, returning 30564 1726882864.19921: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-0000000014bc] 30564 1726882864.19934: sending task result for task 0e448fcc-3ce9-4216-acec-0000000014bc 30564 1726882864.22589: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000014bc 30564 1726882864.22593: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882864.22754: no more pending results, returning what we have 30564 1726882864.22758: results queue empty 30564 1726882864.22759: checking for 
any_errors_fatal 30564 1726882864.22768: done checking for any_errors_fatal 30564 1726882864.22768: checking for max_fail_percentage 30564 1726882864.22770: done checking for max_fail_percentage 30564 1726882864.22771: checking to see if all hosts have failed and the running result is not ok 30564 1726882864.22772: done checking to see if all hosts have failed 30564 1726882864.22773: getting the remaining hosts for this loop 30564 1726882864.22775: done getting the remaining hosts for this loop 30564 1726882864.22778: getting the next task for host managed_node2 30564 1726882864.22786: done getting next task for host managed_node2 30564 1726882864.22791: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882864.22796: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30564 1726882864.22808: getting variables
30564 1726882864.22810: in VariableManager get_vars()
30564 1726882864.22843: Calling all_inventory to load vars for managed_node2
30564 1726882864.22846: Calling groups_inventory to load vars for managed_node2
30564 1726882864.22852: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882864.22863: Calling all_plugins_play to load vars for managed_node2
30564 1726882864.22868: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882864.22871: Calling groups_plugins_play to load vars for managed_node2
30564 1726882864.24847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882864.26810: done with get_vars()
30564 1726882864.26840: done getting variables
30564 1726882864.26915: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 21:41:04 -0400 (0:00:00.908) 0:01:02.850 ******
30564 1726882864.26949: entering _queue_task() for managed_node2/debug
30564 1726882864.27286: worker is 1 (out of 1 available)
30564 1726882864.27297: exiting _queue_task() for managed_node2/debug
30564 1726882864.27309: done queuing things up, now waiting for results queue to drain
30564 1726882864.27310: waiting for pending results...
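(Editor's note: the TASK banner above announces a `debug` task at roles/network/tasks/main.yml:7 of the collection. A minimal sketch of what such a task could look like, reconstructed only from this log's output — the `network_provider` variable and the "Using network provider: nm" message — and not taken from the role's actual source:)

```yaml
# Hypothetical reconstruction from the log output; the real task in
# fedora.linux_system_roles.network may differ.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```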
30564 1726882864.27638: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882864.27876: in run() - task 0e448fcc-3ce9-4216-acec-000000001460 30564 1726882864.27895: variable 'ansible_search_path' from source: unknown 30564 1726882864.27899: variable 'ansible_search_path' from source: unknown 30564 1726882864.27987: calling self._execute() 30564 1726882864.28147: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.28151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.28174: variable 'omit' from source: magic vars 30564 1726882864.28725: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.28744: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.28773: variable 'omit' from source: magic vars 30564 1726882864.28833: variable 'omit' from source: magic vars 30564 1726882864.29009: variable 'network_provider' from source: set_fact 30564 1726882864.29027: variable 'omit' from source: magic vars 30564 1726882864.29105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882864.29141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882864.29161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882864.29204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882864.29217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882864.29247: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882864.29250: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882864.29253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.29378: Set connection var ansible_timeout to 10 30564 1726882864.29382: Set connection var ansible_pipelining to False 30564 1726882864.29384: Set connection var ansible_shell_type to sh 30564 1726882864.29392: Set connection var ansible_shell_executable to /bin/sh 30564 1726882864.29406: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882864.29408: Set connection var ansible_connection to ssh 30564 1726882864.29438: variable 'ansible_shell_executable' from source: unknown 30564 1726882864.29442: variable 'ansible_connection' from source: unknown 30564 1726882864.29445: variable 'ansible_module_compression' from source: unknown 30564 1726882864.29447: variable 'ansible_shell_type' from source: unknown 30564 1726882864.29449: variable 'ansible_shell_executable' from source: unknown 30564 1726882864.29451: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.29454: variable 'ansible_pipelining' from source: unknown 30564 1726882864.29456: variable 'ansible_timeout' from source: unknown 30564 1726882864.29461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.29628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882864.29643: variable 'omit' from source: magic vars 30564 1726882864.29646: starting attempt loop 30564 1726882864.29649: running the handler 30564 1726882864.29698: handler run complete 30564 1726882864.29712: attempt loop complete, returning result 30564 1726882864.29719: _execute() done 30564 1726882864.29722: dumping result to json 30564 1726882864.29732: done dumping result, returning 
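(Editor's note: the "Set connection var" and host-vars records above reflect per-host settings loaded from the inventory at /tmp/network-91m/inventory.yml, which the header shows setting `ansible_host` and `ansible_ssh_extra_args` for both managed nodes. A hypothetical sketch of an inventory with that shape — the actual values are not shown in this log and the placeholders below are not real:)

```yaml
# Hypothetical sketch only: structure inferred from the variables the
# log shows being set; <address> and <ssh options> are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: <address>
      ansible_ssh_extra_args: <ssh options>
    managed_node2:
      ansible_host: <address>
      ansible_ssh_extra_args: <ssh options>
```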
30564 1726882864.29740: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000001460]
30564 1726882864.29750: sending task result for task 0e448fcc-3ce9-4216-acec-000000001460
ok: [managed_node2] => {}

MSG:

Using network provider: nm
30564 1726882864.29899: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001460
30564 1726882864.29917: no more pending results, returning what we have
30564 1726882864.29921: results queue empty
30564 1726882864.29922: checking for any_errors_fatal
30564 1726882864.29936: done checking for any_errors_fatal
30564 1726882864.29937: checking for max_fail_percentage
30564 1726882864.29940: done checking for max_fail_percentage
30564 1726882864.29941: checking to see if all hosts have failed and the running result is not ok
30564 1726882864.29941: done checking to see if all hosts have failed
30564 1726882864.29942: getting the remaining hosts for this loop
30564 1726882864.29944: done getting the remaining hosts for this loop
30564 1726882864.29948: getting the next task for host managed_node2
30564 1726882864.29956: done getting next task for host managed_node2
30564 1726882864.29960: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
30564 1726882864.29967: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882864.29980: WORKER PROCESS EXITING 30564 1726882864.29987: getting variables 30564 1726882864.29989: in VariableManager get_vars() 30564 1726882864.30031: Calling all_inventory to load vars for managed_node2 30564 1726882864.30034: Calling groups_inventory to load vars for managed_node2 30564 1726882864.30037: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.30047: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.30051: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.30053: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.31224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.32307: done with get_vars() 30564 1726882864.32322: done getting variables 30564 1726882864.32362: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:04 -0400 (0:00:00.054) 0:01:02.905 ****** 30564 1726882864.32395: entering _queue_task() for managed_node2/fail 30564 1726882864.32592: worker is 1 (out of 1 available) 30564 1726882864.32605: exiting _queue_task() for managed_node2/fail 30564 1726882864.32618: done queuing things up, now waiting for results queue to drain 30564 1726882864.32619: waiting for pending results... 30564 1726882864.32817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882864.33034: in run() - task 0e448fcc-3ce9-4216-acec-000000001461 30564 1726882864.33037: variable 'ansible_search_path' from source: unknown 30564 1726882864.33040: variable 'ansible_search_path' from source: unknown 30564 1726882864.33043: calling self._execute() 30564 1726882864.33168: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.33172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.33176: variable 'omit' from source: magic vars 30564 1726882864.33505: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.33517: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.33642: variable 'network_state' from source: role '' defaults 30564 1726882864.33653: Evaluated conditional (network_state != {}): False 30564 1726882864.33656: when evaluation is False, skipping this task 30564 1726882864.33659: _execute() done 30564 1726882864.33661: dumping result to json 30564 1726882864.33668: done dumping result, returning 30564 1726882864.33678: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-000000001461]
30564 1726882864.33688: sending task result for task 0e448fcc-3ce9-4216-acec-000000001461
30564 1726882864.33785: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001461
30564 1726882864.33789: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882864.33833: no more pending results, returning what we have
30564 1726882864.33837: results queue empty
30564 1726882864.33838: checking for any_errors_fatal
30564 1726882864.33845: done checking for any_errors_fatal
30564 1726882864.33846: checking for max_fail_percentage
30564 1726882864.33848: done checking for max_fail_percentage
30564 1726882864.33849: checking to see if all hosts have failed and the running result is not ok
30564 1726882864.33850: done checking to see if all hosts have failed
30564 1726882864.33851: getting the remaining hosts for this loop
30564 1726882864.33853: done getting the remaining hosts for this loop
30564 1726882864.33856: getting the next task for host managed_node2
30564 1726882864.33863: done getting next task for host managed_node2
30564 1726882864.33869: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30564 1726882864.33875: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882864.33895: getting variables 30564 1726882864.33897: in VariableManager get_vars() 30564 1726882864.33928: Calling all_inventory to load vars for managed_node2 30564 1726882864.33931: Calling groups_inventory to load vars for managed_node2 30564 1726882864.33933: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.33941: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.33944: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.33946: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.35040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.36014: done with get_vars() 30564 1726882864.36030: done getting variables 30564 1726882864.36074: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:04 -0400 (0:00:00.037) 0:01:02.942 ****** 30564 1726882864.36098: entering _queue_task() for managed_node2/fail 30564 1726882864.36304: worker is 1 (out of 1 available) 30564 1726882864.36320: exiting _queue_task() for managed_node2/fail 30564 1726882864.36334: done queuing things up, now waiting for results queue to drain 30564 1726882864.36335: waiting for pending results... 30564 1726882864.36533: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882864.36627: in run() - task 0e448fcc-3ce9-4216-acec-000000001462 30564 1726882864.36637: variable 'ansible_search_path' from source: unknown 30564 1726882864.36641: variable 'ansible_search_path' from source: unknown 30564 1726882864.36674: calling self._execute() 30564 1726882864.36754: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.36758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.36773: variable 'omit' from source: magic vars 30564 1726882864.37050: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.37061: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.37144: variable 'network_state' from source: role '' defaults 30564 1726882864.37153: Evaluated conditional (network_state != {}): False 30564 1726882864.37156: when evaluation is False, skipping this task 30564 1726882864.37159: _execute() done 30564 1726882864.37163: dumping result to json 30564 1726882864.37165: done dumping result, returning 30564 1726882864.37180: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-000000001462]
30564 1726882864.37183: sending task result for task 0e448fcc-3ce9-4216-acec-000000001462
30564 1726882864.37271: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001462
30564 1726882864.37274: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882864.37333: no more pending results, returning what we have
30564 1726882864.37336: results queue empty
30564 1726882864.37337: checking for any_errors_fatal
30564 1726882864.37344: done checking for any_errors_fatal
30564 1726882864.37345: checking for max_fail_percentage
30564 1726882864.37346: done checking for max_fail_percentage
30564 1726882864.37347: checking to see if all hosts have failed and the running result is not ok
30564 1726882864.37348: done checking to see if all hosts have failed
30564 1726882864.37349: getting the remaining hosts for this loop
30564 1726882864.37350: done getting the remaining hosts for this loop
30564 1726882864.37354: getting the next task for host managed_node2
30564 1726882864.37360: done getting next task for host managed_node2
30564 1726882864.37366: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30564 1726882864.37374: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882864.37400: getting variables 30564 1726882864.37402: in VariableManager get_vars() 30564 1726882864.37439: Calling all_inventory to load vars for managed_node2 30564 1726882864.37442: Calling groups_inventory to load vars for managed_node2 30564 1726882864.37447: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.37497: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.37500: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.37504: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.39020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.40714: done with get_vars() 30564 1726882864.40737: done getting variables 30564 1726882864.40798: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:04 -0400 (0:00:00.047) 0:01:02.989 ****** 30564 1726882864.40831: entering _queue_task() for managed_node2/fail 30564 1726882864.41100: worker is 1 (out of 1 available) 30564 1726882864.41113: exiting _queue_task() for managed_node2/fail 30564 1726882864.41127: done queuing things up, now waiting for results queue to drain 30564 1726882864.41128: waiting for pending results... 30564 1726882864.41430: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882864.41571: in run() - task 0e448fcc-3ce9-4216-acec-000000001463 30564 1726882864.41590: variable 'ansible_search_path' from source: unknown 30564 1726882864.41594: variable 'ansible_search_path' from source: unknown 30564 1726882864.41627: calling self._execute() 30564 1726882864.41725: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.41731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.41741: variable 'omit' from source: magic vars 30564 1726882864.42097: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.42109: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.42282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882864.44752: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882864.44817: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882864.44856: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882864.44891: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882864.44921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882864.45001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.45043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.45076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.45118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.45132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.45229: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.45244: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882864.45248: when evaluation is False, skipping this task 30564 1726882864.45251: _execute() done 30564 1726882864.45253: dumping result to json 30564 1726882864.45256: done dumping result, returning 30564 1726882864.45264: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming 
configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-000000001463]
30564 1726882864.45275: sending task result for task 0e448fcc-3ce9-4216-acec-000000001463
30564 1726882864.45373: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001463
30564 1726882864.45377: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
30564 1726882864.45424: no more pending results, returning what we have
30564 1726882864.45429: results queue empty
30564 1726882864.45430: checking for any_errors_fatal
30564 1726882864.45441: done checking for any_errors_fatal
30564 1726882864.45442: checking for max_fail_percentage
30564 1726882864.45444: done checking for max_fail_percentage
30564 1726882864.45445: checking to see if all hosts have failed and the running result is not ok
30564 1726882864.45446: done checking to see if all hosts have failed
30564 1726882864.45447: getting the remaining hosts for this loop
30564 1726882864.45449: done getting the remaining hosts for this loop
30564 1726882864.45453: getting the next task for host managed_node2
30564 1726882864.45461: done getting next task for host managed_node2
30564 1726882864.45468: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882864.45474: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882864.45499: getting variables 30564 1726882864.45501: in VariableManager get_vars() 30564 1726882864.45543: Calling all_inventory to load vars for managed_node2 30564 1726882864.45546: Calling groups_inventory to load vars for managed_node2 30564 1726882864.45549: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.45560: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.45565: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.45568: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.47225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.49084: done with get_vars() 30564 1726882864.49107: done getting variables 30564 1726882864.49167: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:04 -0400 (0:00:00.083) 0:01:03.073 ****** 30564 1726882864.49201: entering _queue_task() for managed_node2/dnf 30564 1726882864.49483: worker is 1 (out of 1 available) 30564 1726882864.49496: exiting _queue_task() for managed_node2/dnf 30564 1726882864.49508: done queuing things up, now waiting for results queue to drain 30564 1726882864.49510: waiting for pending results... 30564 1726882864.49800: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882864.49929: in run() - task 0e448fcc-3ce9-4216-acec-000000001464 30564 1726882864.49942: variable 'ansible_search_path' from source: unknown 30564 1726882864.49951: variable 'ansible_search_path' from source: unknown 30564 1726882864.49986: calling self._execute() 30564 1726882864.50082: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.50089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.50097: variable 'omit' from source: magic vars 30564 1726882864.50467: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.50480: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.50685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882864.53101: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882864.53170: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882864.53204: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882864.53243: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882864.53272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882864.53348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.53390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.53415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.53459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.53474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.53583: variable 'ansible_distribution' from source: facts 30564 1726882864.53586: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.53600: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882864.53708: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882864.53838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.53861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.53894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.53932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.53945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.53989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.54010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.54033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.54073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.54089: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.54127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.54149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.54176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.54218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.54233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.54397: variable 'network_connections' from source: include params 30564 1726882864.54408: variable 'interface' from source: play vars 30564 1726882864.54473: variable 'interface' from source: play vars 30564 1726882864.54543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882864.54706: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882864.54740: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882864.54777: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882864.54803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882864.54840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882864.54865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882864.54892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.54915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882864.54959: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882864.55210: variable 'network_connections' from source: include params 30564 1726882864.55214: variable 'interface' from source: play vars 30564 1726882864.55274: variable 'interface' from source: play vars 30564 1726882864.55300: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882864.55303: when evaluation is False, skipping this task 30564 1726882864.55306: _execute() done 30564 1726882864.55308: dumping result to json 30564 1726882864.55310: done dumping result, returning 30564 1726882864.55316: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001464] 30564 
1726882864.55321: sending task result for task 0e448fcc-3ce9-4216-acec-000000001464
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882864.55465: no more pending results, returning what we have 30564 1726882864.55469: results queue empty 30564 1726882864.55471: checking for any_errors_fatal 30564 1726882864.55479: done checking for any_errors_fatal 30564 1726882864.55480: checking for max_fail_percentage 30564 1726882864.55481: done checking for max_fail_percentage 30564 1726882864.55482: checking to see if all hosts have failed and the running result is not ok 30564 1726882864.55483: done checking to see if all hosts have failed 30564 1726882864.55484: getting the remaining hosts for this loop 30564 1726882864.55486: done getting the remaining hosts for this loop 30564 1726882864.55489: getting the next task for host managed_node2 30564 1726882864.55498: done getting next task for host managed_node2 30564 1726882864.55502: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882864.55507: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882864.55519: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001464 30564 1726882864.55538: getting variables 30564 1726882864.55540: in VariableManager get_vars() 30564 1726882864.55580: Calling all_inventory to load vars for managed_node2 30564 1726882864.55583: Calling groups_inventory to load vars for managed_node2 30564 1726882864.55585: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.55597: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.55600: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.55603: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.56122: WORKER PROCESS EXITING 30564 1726882864.57187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.58950: done with get_vars() 30564 1726882864.58984: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882864.59084: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:04 -0400 (0:00:00.099) 0:01:03.172 ****** 30564 1726882864.59123: entering _queue_task() for managed_node2/yum 30564 1726882864.59470: worker is 1 (out of 1 available) 30564 1726882864.59485: exiting _queue_task() for managed_node2/yum 30564 1726882864.59497: done queuing things up, now waiting for results queue to drain 30564 1726882864.59499: waiting for pending results... 30564 1726882864.59909: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882864.60042: in run() - task 0e448fcc-3ce9-4216-acec-000000001465 30564 1726882864.60070: variable 'ansible_search_path' from source: unknown 30564 1726882864.60074: variable 'ansible_search_path' from source: unknown 30564 1726882864.60110: calling self._execute() 30564 1726882864.60192: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.60197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.60207: variable 'omit' from source: magic vars 30564 1726882864.60493: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.60505: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.60628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882864.62854: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882864.62931: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882864.62977: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882864.63017: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882864.63050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882864.63140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.63522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.63557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.63612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.63630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.63732: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.63753: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882864.63760: when evaluation is False, skipping this task 30564 1726882864.63769: _execute() done 30564 1726882864.63775: dumping result to json 30564 1726882864.63784: done dumping result, 
returning 30564 1726882864.63795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001465] 30564 1726882864.63804: sending task result for task 0e448fcc-3ce9-4216-acec-000000001465
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30564 1726882864.63979: no more pending results, returning what we have 30564 1726882864.63982: results queue empty 30564 1726882864.63983: checking for any_errors_fatal 30564 1726882864.63991: done checking for any_errors_fatal 30564 1726882864.63991: checking for max_fail_percentage 30564 1726882864.63993: done checking for max_fail_percentage 30564 1726882864.63994: checking to see if all hosts have failed and the running result is not ok 30564 1726882864.63995: done checking to see if all hosts have failed 30564 1726882864.63996: getting the remaining hosts for this loop 30564 1726882864.63997: done getting the remaining hosts for this loop 30564 1726882864.64001: getting the next task for host managed_node2 30564 1726882864.64009: done getting next task for host managed_node2 30564 1726882864.64012: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882864.64016: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882864.64043: getting variables 30564 1726882864.64044: in VariableManager get_vars() 30564 1726882864.64084: Calling all_inventory to load vars for managed_node2 30564 1726882864.64087: Calling groups_inventory to load vars for managed_node2 30564 1726882864.64089: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.64100: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.64103: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.64106: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.64708: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001465 30564 1726882864.64711: WORKER PROCESS EXITING 30564 1726882864.66590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.68391: done with get_vars() 30564 1726882864.68416: done getting variables 30564 1726882864.68481: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:04 -0400 (0:00:00.093) 0:01:03.266 ****** 30564 1726882864.68523: entering _queue_task() for managed_node2/fail 30564 1726882864.68856: worker is 1 (out of 1 available) 30564 1726882864.68875: exiting _queue_task() for managed_node2/fail 30564 1726882864.68889: done queuing things up, now waiting for results queue to drain 30564 1726882864.68890: waiting for pending results... 30564 1726882864.69296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882864.69448: in run() - task 0e448fcc-3ce9-4216-acec-000000001466 30564 1726882864.69460: variable 'ansible_search_path' from source: unknown 30564 1726882864.69476: variable 'ansible_search_path' from source: unknown 30564 1726882864.69513: calling self._execute() 30564 1726882864.69727: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.69730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.69742: variable 'omit' from source: magic vars 30564 1726882864.70595: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.70605: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.70842: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882864.71167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882864.73277: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882864.73337: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882864.73543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882864.73585: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882864.73613: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882864.73695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.73736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.73762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.73819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.73835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.73890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 
1726882864.73914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.73958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.74006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.74024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.74146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.74150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.74152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.74265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.74270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30564 1726882864.74438: variable 'network_connections' from source: include params 30564 1726882864.74455: variable 'interface' from source: play vars 30564 1726882864.74490: variable 'interface' from source: play vars 30564 1726882864.74568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882864.74757: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882864.74802: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882864.74847: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882864.74881: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882864.74917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882864.74946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882864.74979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.75018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882864.75074: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882864.75649: variable 'network_connections' from source: include params 30564 1726882864.75653: variable 'interface' from source: play 
vars 30564 1726882864.76028: variable 'interface' from source: play vars 30564 1726882864.76052: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882864.76056: when evaluation is False, skipping this task 30564 1726882864.76058: _execute() done 30564 1726882864.76061: dumping result to json 30564 1726882864.76064: done dumping result, returning 30564 1726882864.76077: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001466] 30564 1726882864.76083: sending task result for task 0e448fcc-3ce9-4216-acec-000000001466 30564 1726882864.76182: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001466 30564 1726882864.76185: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882864.76240: no more pending results, returning what we have 30564 1726882864.76245: results queue empty 30564 1726882864.76246: checking for any_errors_fatal 30564 1726882864.76255: done checking for any_errors_fatal 30564 1726882864.76256: checking for max_fail_percentage 30564 1726882864.76259: done checking for max_fail_percentage 30564 1726882864.76260: checking to see if all hosts have failed and the running result is not ok 30564 1726882864.76260: done checking to see if all hosts have failed 30564 1726882864.76261: getting the remaining hosts for this loop 30564 1726882864.76266: done getting the remaining hosts for this loop 30564 1726882864.76270: getting the next task for host managed_node2 30564 1726882864.76280: done getting next task for host managed_node2 30564 1726882864.76284: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882864.76289: ^ state is:
HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882864.76313: getting variables 30564 1726882864.76314: in VariableManager get_vars() 30564 1726882864.76350: Calling all_inventory to load vars for managed_node2 30564 1726882864.76353: Calling groups_inventory to load vars for managed_node2 30564 1726882864.76355: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.76422: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.76425: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.76428: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.80291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.82880: done with get_vars() 30564 1726882864.82917: done getting variables 30564 1726882864.83003: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:04 -0400 (0:00:00.145) 0:01:03.411 ****** 30564 1726882864.83031: entering _queue_task() for managed_node2/package 30564 1726882864.83306: worker is 1 (out of 1 available) 30564 1726882864.83322: exiting _queue_task() for managed_node2/package 30564 1726882864.83334: done queuing things up, now waiting for results queue to drain 30564 1726882864.83336: waiting for pending results... 
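[editor's note: the log entries above record three `when:` evaluations that decide the skips: `ansible_distribution_major_version != '6'` came back True, `__network_wireless_connections_defined or __network_team_connections_defined` came back False, and `ansible_distribution_major_version | int < 8` came back False. A minimal Python sketch of those decisions, assuming a major version of "9" and no wireless or team connections (the function name and shape are illustrative, not Ansible internals):]

```python
# Illustrative sketch only, not Ansible source code: reproduces the three
# conditional results seen in the log. The distribution fact is a string,
# so the first check is a string comparison while the third coerces it
# with int(), mirroring the `| int` Jinja2 filter in the role's condition.
def evaluate_skip_conditions(facts, wireless_defined, team_defined):
    """Return the three conditional results in the order the log evaluates them."""
    major = facts["ansible_distribution_major_version"]
    return [
        # when: ansible_distribution_major_version != '6'  (gates the whole block)
        major != "6",
        # when: __network_wireless_connections_defined or __network_team_connections_defined
        wireless_defined or team_defined,
        # when: ansible_distribution_major_version | int < 8  (YUM-era distros only)
        int(major) < 8,
    ]

# A host with major version "9" and no wireless/team connections yields
# [True, False, False]: the DNF and YUM update-check tasks are both skipped.
print(evaluate_skip_conditions({"ansible_distribution_major_version": "9"}, False, False))
```

[a host with major version "7" would instead yield [True, False, True], routing the update check through the YUM task at main.yml:48 rather than the DNF task at main.yml:36.]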
30564 1726882864.83528: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882864.83630: in run() - task 0e448fcc-3ce9-4216-acec-000000001467 30564 1726882864.83642: variable 'ansible_search_path' from source: unknown 30564 1726882864.83647: variable 'ansible_search_path' from source: unknown 30564 1726882864.83680: calling self._execute() 30564 1726882864.83757: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.83761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.83776: variable 'omit' from source: magic vars 30564 1726882864.84049: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.84061: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.84199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882864.84395: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882864.84431: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882864.84455: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882864.84761: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882864.84888: variable 'network_packages' from source: role '' defaults 30564 1726882864.84986: variable '__network_provider_setup' from source: role '' defaults 30564 1726882864.85021: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882864.85063: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882864.85073: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882864.85143: variable 
'__network_packages_default_nm' from source: role '' defaults 30564 1726882864.85502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882864.88031: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882864.88035: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882864.88040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882864.88043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882864.88045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882864.88067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.88113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.88153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.88207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.88218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882864.88345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.88349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.88358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.88483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.88488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.88718: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882864.89089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.89094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.89097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.89099: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.89103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.89105: variable 'ansible_python' from source: facts 30564 1726882864.89129: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882864.89475: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882864.89603: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882864.89777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.89806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.89835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.90295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.90315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.90377: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882864.90416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882864.90433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.90473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882864.90498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882864.90646: variable 'network_connections' from source: include params 30564 1726882864.90662: variable 'interface' from source: play vars 30564 1726882864.90838: variable 'interface' from source: play vars 30564 1726882864.90903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882864.90922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882864.90942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882864.90988: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882864.91033: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882864.91340: variable 'network_connections' from source: include params 30564 1726882864.91343: variable 'interface' from source: play vars 30564 1726882864.91430: variable 'interface' from source: play vars 30564 1726882864.91454: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882864.91511: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882864.91724: variable 'network_connections' from source: include params 30564 1726882864.91727: variable 'interface' from source: play vars 30564 1726882864.91778: variable 'interface' from source: play vars 30564 1726882864.91794: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882864.91846: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882864.92050: variable 'network_connections' from source: include params 30564 1726882864.92053: variable 'interface' from source: play vars 30564 1726882864.92104: variable 'interface' from source: play vars 30564 1726882864.92140: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882864.92186: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882864.92189: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882864.92300: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882864.92556: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882864.93094: variable 'network_connections' from source: include params 30564 1726882864.93104: variable 'interface' from 
source: play vars 30564 1726882864.93335: variable 'interface' from source: play vars 30564 1726882864.93346: variable 'ansible_distribution' from source: facts 30564 1726882864.93353: variable '__network_rh_distros' from source: role '' defaults 30564 1726882864.93440: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.93458: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882864.93774: variable 'ansible_distribution' from source: facts 30564 1726882864.93782: variable '__network_rh_distros' from source: role '' defaults 30564 1726882864.93787: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.93798: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882864.93908: variable 'ansible_distribution' from source: facts 30564 1726882864.93912: variable '__network_rh_distros' from source: role '' defaults 30564 1726882864.93916: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.93972: variable 'network_provider' from source: set_fact 30564 1726882864.93987: variable 'ansible_facts' from source: unknown 30564 1726882864.94451: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882864.94454: when evaluation is False, skipping this task 30564 1726882864.94457: _execute() done 30564 1726882864.94459: dumping result to json 30564 1726882864.94461: done dumping result, returning 30564 1726882864.94469: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000001467] 30564 1726882864.94477: sending task result for task 0e448fcc-3ce9-4216-acec-000000001467 30564 1726882864.94568: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001467 30564 1726882864.94570: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882864.94615: no more pending results, returning what we have 30564 1726882864.94619: results queue empty 30564 1726882864.94620: checking for any_errors_fatal 30564 1726882864.94627: done checking for any_errors_fatal 30564 1726882864.94628: checking for max_fail_percentage 30564 1726882864.94630: done checking for max_fail_percentage 30564 1726882864.94630: checking to see if all hosts have failed and the running result is not ok 30564 1726882864.94631: done checking to see if all hosts have failed 30564 1726882864.94632: getting the remaining hosts for this loop 30564 1726882864.94634: done getting the remaining hosts for this loop 30564 1726882864.94637: getting the next task for host managed_node2 30564 1726882864.94645: done getting next task for host managed_node2 30564 1726882864.94648: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882864.94652: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882864.94676: getting variables 30564 1726882864.94678: in VariableManager get_vars() 30564 1726882864.94718: Calling all_inventory to load vars for managed_node2 30564 1726882864.94720: Calling groups_inventory to load vars for managed_node2 30564 1726882864.94722: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.94731: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.94734: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.94737: Calling groups_plugins_play to load vars for managed_node2 30564 1726882864.95833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882864.96997: done with get_vars() 30564 1726882864.97021: done getting variables 30564 1726882864.97097: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:04 -0400 (0:00:00.140) 0:01:03.552 ****** 30564 1726882864.97128: entering _queue_task() for managed_node2/package 30564 1726882864.97376: worker is 1 (out of 1 available) 30564 1726882864.97389: exiting _queue_task() for managed_node2/package 30564 1726882864.97401: done queuing things up, now waiting for results queue to drain 30564 
1726882864.97402: waiting for pending results... 30564 1726882864.97688: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882864.97811: in run() - task 0e448fcc-3ce9-4216-acec-000000001468 30564 1726882864.97816: variable 'ansible_search_path' from source: unknown 30564 1726882864.97821: variable 'ansible_search_path' from source: unknown 30564 1726882864.97850: calling self._execute() 30564 1726882864.97929: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882864.97933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882864.97955: variable 'omit' from source: magic vars 30564 1726882864.98307: variable 'ansible_distribution_major_version' from source: facts 30564 1726882864.98327: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882864.98456: variable 'network_state' from source: role '' defaults 30564 1726882864.98468: Evaluated conditional (network_state != {}): False 30564 1726882864.98474: when evaluation is False, skipping this task 30564 1726882864.98478: _execute() done 30564 1726882864.98480: dumping result to json 30564 1726882864.98482: done dumping result, returning 30564 1726882864.98492: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001468] 30564 1726882864.98497: sending task result for task 0e448fcc-3ce9-4216-acec-000000001468 30564 1726882864.98610: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001468 30564 1726882864.98613: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882864.98673: no more pending results, returning what we have 30564 1726882864.98707: 
results queue empty 30564 1726882864.98709: checking for any_errors_fatal 30564 1726882864.98715: done checking for any_errors_fatal 30564 1726882864.98715: checking for max_fail_percentage 30564 1726882864.98717: done checking for max_fail_percentage 30564 1726882864.98718: checking to see if all hosts have failed and the running result is not ok 30564 1726882864.98719: done checking to see if all hosts have failed 30564 1726882864.98746: getting the remaining hosts for this loop 30564 1726882864.98748: done getting the remaining hosts for this loop 30564 1726882864.98751: getting the next task for host managed_node2 30564 1726882864.98756: done getting next task for host managed_node2 30564 1726882864.98760: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882864.98763: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882864.98779: getting variables 30564 1726882864.98780: in VariableManager get_vars() 30564 1726882864.98834: Calling all_inventory to load vars for managed_node2 30564 1726882864.98837: Calling groups_inventory to load vars for managed_node2 30564 1726882864.98840: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882864.98849: Calling all_plugins_play to load vars for managed_node2 30564 1726882864.98857: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882864.98861: Calling groups_plugins_play to load vars for managed_node2 30564 1726882865.00000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882865.00981: done with get_vars() 30564 1726882865.00996: done getting variables 30564 1726882865.01034: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:05 -0400 (0:00:00.039) 0:01:03.591 ****** 30564 1726882865.01056: entering _queue_task() for managed_node2/package 30564 1726882865.01250: worker is 1 (out of 1 available) 30564 1726882865.01265: exiting _queue_task() for managed_node2/package 30564 1726882865.01279: done queuing things up, now waiting for results queue to drain 30564 1726882865.01281: waiting for pending results... 
30564 1726882865.01472: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882865.01559: in run() - task 0e448fcc-3ce9-4216-acec-000000001469 30564 1726882865.01575: variable 'ansible_search_path' from source: unknown 30564 1726882865.01579: variable 'ansible_search_path' from source: unknown 30564 1726882865.01601: calling self._execute() 30564 1726882865.01679: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882865.01683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882865.01692: variable 'omit' from source: magic vars 30564 1726882865.01970: variable 'ansible_distribution_major_version' from source: facts 30564 1726882865.01979: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882865.02071: variable 'network_state' from source: role '' defaults 30564 1726882865.02075: Evaluated conditional (network_state != {}): False 30564 1726882865.02077: when evaluation is False, skipping this task 30564 1726882865.02080: _execute() done 30564 1726882865.02082: dumping result to json 30564 1726882865.02085: done dumping result, returning 30564 1726882865.02093: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001469] 30564 1726882865.02100: sending task result for task 0e448fcc-3ce9-4216-acec-000000001469 30564 1726882865.02202: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001469 30564 1726882865.02204: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882865.02255: no more pending results, returning what we have 30564 1726882865.02259: results queue empty 30564 1726882865.02260: checking for 
any_errors_fatal 30564 1726882865.02272: done checking for any_errors_fatal 30564 1726882865.02273: checking for max_fail_percentage 30564 1726882865.02275: done checking for max_fail_percentage 30564 1726882865.02276: checking to see if all hosts have failed and the running result is not ok 30564 1726882865.02277: done checking to see if all hosts have failed 30564 1726882865.02278: getting the remaining hosts for this loop 30564 1726882865.02279: done getting the remaining hosts for this loop 30564 1726882865.02282: getting the next task for host managed_node2 30564 1726882865.02289: done getting next task for host managed_node2 30564 1726882865.02292: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882865.02297: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882865.02323: getting variables 30564 1726882865.02325: in VariableManager get_vars() 30564 1726882865.02352: Calling all_inventory to load vars for managed_node2 30564 1726882865.02354: Calling groups_inventory to load vars for managed_node2 30564 1726882865.02355: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882865.02361: Calling all_plugins_play to load vars for managed_node2 30564 1726882865.02365: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882865.02367: Calling groups_plugins_play to load vars for managed_node2 30564 1726882865.03290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882865.05052: done with get_vars() 30564 1726882865.05071: done getting variables 30564 1726882865.05113: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:05 -0400 (0:00:00.040) 0:01:03.632 ****** 30564 1726882865.05136: entering _queue_task() for managed_node2/service 30564 1726882865.05363: worker is 1 (out of 1 available) 30564 1726882865.05380: exiting _queue_task() for managed_node2/service 30564 1726882865.05398: done queuing things up, now waiting for results queue to drain 30564 1726882865.05399: waiting for pending results... 
30564 1726882865.05908: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882865.05929: in run() - task 0e448fcc-3ce9-4216-acec-00000000146a 30564 1726882865.05962: variable 'ansible_search_path' from source: unknown 30564 1726882865.05968: variable 'ansible_search_path' from source: unknown 30564 1726882865.06004: calling self._execute() 30564 1726882865.06100: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882865.06117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882865.06138: variable 'omit' from source: magic vars 30564 1726882865.06487: variable 'ansible_distribution_major_version' from source: facts 30564 1726882865.06498: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882865.06601: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882865.06740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882865.13571: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882865.13640: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882865.13681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882865.13717: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882865.13737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882865.13791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882865.13818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.13853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.13909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.13923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.13966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.13986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.14006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.14042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.14065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.14116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.14144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.14161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.14209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.14236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.14441: variable 'network_connections' from source: include params 30564 1726882865.14457: variable 'interface' from source: play vars 30564 1726882865.14552: variable 'interface' from source: play vars 30564 1726882865.14645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882865.14755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882865.14787: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882865.14809: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882865.14829: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882865.14860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882865.14891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882865.14911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.14929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882865.14957: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882865.15112: variable 'network_connections' from source: include params 30564 1726882865.15116: variable 'interface' from source: play vars 30564 1726882865.15173: variable 'interface' from source: play vars 30564 1726882865.15190: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882865.15224: when evaluation is False, skipping this task 30564 1726882865.15231: _execute() done 30564 1726882865.15246: dumping result to json 30564 1726882865.15261: done dumping result, returning 30564 1726882865.15278: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000146a] 30564 1726882865.15290: sending task result for task 0e448fcc-3ce9-4216-acec-00000000146a 30564 1726882865.15395: done sending task result for task 
0e448fcc-3ce9-4216-acec-00000000146a 30564 1726882865.15403: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882865.15454: no more pending results, returning what we have 30564 1726882865.15458: results queue empty 30564 1726882865.15459: checking for any_errors_fatal 30564 1726882865.15471: done checking for any_errors_fatal 30564 1726882865.15472: checking for max_fail_percentage 30564 1726882865.15474: done checking for max_fail_percentage 30564 1726882865.15475: checking to see if all hosts have failed and the running result is not ok 30564 1726882865.15476: done checking to see if all hosts have failed 30564 1726882865.15477: getting the remaining hosts for this loop 30564 1726882865.15479: done getting the remaining hosts for this loop 30564 1726882865.15483: getting the next task for host managed_node2 30564 1726882865.15490: done getting next task for host managed_node2 30564 1726882865.15494: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882865.15498: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882865.15517: getting variables 30564 1726882865.15518: in VariableManager get_vars() 30564 1726882865.15561: Calling all_inventory to load vars for managed_node2 30564 1726882865.15565: Calling groups_inventory to load vars for managed_node2 30564 1726882865.15570: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882865.15580: Calling all_plugins_play to load vars for managed_node2 30564 1726882865.15583: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882865.15588: Calling groups_plugins_play to load vars for managed_node2 30564 1726882865.22537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882865.24293: done with get_vars() 30564 1726882865.24325: done getting variables 30564 1726882865.24380: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:05 -0400 (0:00:00.192) 0:01:03.825 ****** 30564 1726882865.24417: entering _queue_task() for managed_node2/service 30564 1726882865.24801: worker is 1 (out of 1 available) 30564 1726882865.24821: exiting _queue_task() for managed_node2/service 30564 1726882865.24838: done 
queuing things up, now waiting for results queue to drain 30564 1726882865.24844: waiting for pending results... 30564 1726882865.25213: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882865.25413: in run() - task 0e448fcc-3ce9-4216-acec-00000000146b 30564 1726882865.25434: variable 'ansible_search_path' from source: unknown 30564 1726882865.25448: variable 'ansible_search_path' from source: unknown 30564 1726882865.25506: calling self._execute() 30564 1726882865.25643: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882865.25674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882865.25680: variable 'omit' from source: magic vars 30564 1726882865.26184: variable 'ansible_distribution_major_version' from source: facts 30564 1726882865.26207: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882865.26379: variable 'network_provider' from source: set_fact 30564 1726882865.26382: variable 'network_state' from source: role '' defaults 30564 1726882865.26392: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882865.26404: variable 'omit' from source: magic vars 30564 1726882865.26478: variable 'omit' from source: magic vars 30564 1726882865.26500: variable 'network_service_name' from source: role '' defaults 30564 1726882865.26582: variable 'network_service_name' from source: role '' defaults 30564 1726882865.26655: variable '__network_provider_setup' from source: role '' defaults 30564 1726882865.26661: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882865.26713: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882865.26719: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882865.26764: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882865.26916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882865.29779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882865.30158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882865.30217: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882865.30255: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882865.30305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882865.30394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.30417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.30436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.30465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.30481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.30512: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.30529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.30548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.30581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.30589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.30744: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882865.30822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.30841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.30857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.30889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.30901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.30988: variable 'ansible_python' from source: facts 30564 1726882865.31000: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882865.31058: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882865.31115: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882865.31202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.31219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.31237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.31262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.31278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.31312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.31332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.31351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.31380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.31391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.31490: variable 'network_connections' from source: include params 30564 1726882865.31493: variable 'interface' from source: play vars 30564 1726882865.31545: variable 'interface' from source: play vars 30564 1726882865.31621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882865.31752: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882865.31791: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882865.31822: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882865.31853: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882865.31899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882865.31922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882865.31945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.31969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882865.32009: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882865.32195: variable 'network_connections' from source: include params 30564 1726882865.32201: variable 'interface' from source: play vars 30564 1726882865.32253: variable 'interface' from source: play vars 30564 1726882865.32281: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882865.32362: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882865.32757: variable 'network_connections' from source: include params 30564 1726882865.32760: variable 'interface' from source: play vars 30564 1726882865.32862: variable 'interface' from source: play vars 30564 1726882865.32897: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882865.32999: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882865.33372: variable 'network_connections' from source: include params 30564 1726882865.33378: variable 'interface' from source: play vars 30564 1726882865.33765: variable 'interface' from source: play vars 30564 1726882865.34096: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882865.34453: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882865.34498: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882865.34800: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882865.35841: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882865.38197: variable 'network_connections' from source: include params 30564 1726882865.38201: variable 'interface' from source: play vars 30564 1726882865.38247: variable 'interface' from source: play vars 30564 1726882865.38253: variable 'ansible_distribution' from source: facts 30564 1726882865.38256: variable '__network_rh_distros' from source: role '' defaults 30564 1726882865.38261: variable 'ansible_distribution_major_version' from source: facts 30564 1726882865.38278: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882865.38396: variable 'ansible_distribution' from source: facts 30564 1726882865.38400: variable '__network_rh_distros' from source: role '' defaults 30564 1726882865.38404: variable 'ansible_distribution_major_version' from source: facts 30564 1726882865.38415: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882865.38532: variable 'ansible_distribution' from source: facts 30564 1726882865.38537: variable '__network_rh_distros' from source: role '' defaults 30564 1726882865.38551: variable 'ansible_distribution_major_version' from source: facts 30564 1726882865.38594: variable 'network_provider' from source: set_fact 30564 1726882865.38612: variable 'omit' from source: magic vars 30564 1726882865.38635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882865.38655: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882865.38672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882865.38690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882865.38698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882865.38720: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882865.38724: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882865.38726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882865.38800: Set connection var ansible_timeout to 10 30564 1726882865.38803: Set connection var ansible_pipelining to False 30564 1726882865.38806: Set connection var ansible_shell_type to sh 30564 1726882865.38812: Set connection var ansible_shell_executable to /bin/sh 30564 1726882865.38818: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882865.38820: Set connection var ansible_connection to ssh 30564 1726882865.38842: variable 'ansible_shell_executable' from source: unknown 30564 1726882865.38845: variable 'ansible_connection' from source: unknown 30564 1726882865.38847: variable 'ansible_module_compression' from source: unknown 30564 1726882865.38850: variable 'ansible_shell_type' from source: unknown 30564 1726882865.38853: variable 'ansible_shell_executable' from source: unknown 30564 1726882865.38855: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882865.38857: variable 'ansible_pipelining' from source: unknown 30564 1726882865.38860: variable 'ansible_timeout' from source: unknown 30564 1726882865.38866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882865.38938: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882865.38947: variable 'omit' from source: magic vars 30564 1726882865.38950: starting attempt loop 30564 1726882865.38952: running the handler 30564 1726882865.39012: variable 'ansible_facts' from source: unknown 30564 1726882865.39785: _low_level_execute_command(): starting 30564 1726882865.39788: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882865.41238: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882865.41329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.41400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.41470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.41566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.41601: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882865.41621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.41643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882865.41656: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882865.41671: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882865.41686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.41700: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882865.41720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.41740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.41754: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882865.41770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.41876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882865.41901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882865.42036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882865.43688: stdout chunk (state=3): >>>/root <<< 30564 1726882865.43787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882865.43872: stderr chunk (state=3): >>><<< 30564 1726882865.43886: stdout chunk (state=3): >>><<< 30564 1726882865.44007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882865.44011: _low_level_execute_command(): starting 30564 1726882865.44015: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749 `" && echo ansible-tmp-1726882865.4391642-33310-220892780125749="` echo /root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749 `" ) && sleep 0' 30564 1726882865.44588: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882865.44629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.44633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.44635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.44685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.44691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.44694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882865.44696: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.44747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882865.44751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882865.44767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882865.44914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882865.46775: stdout chunk (state=3): >>>ansible-tmp-1726882865.4391642-33310-220892780125749=/root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749 <<< 30564 1726882865.46881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882865.46930: stderr chunk (state=3): >>><<< 30564 1726882865.46933: stdout chunk (state=3): >>><<< 30564 1726882865.46948: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882865.4391642-33310-220892780125749=/root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882865.46982: variable 'ansible_module_compression' from source: unknown 30564 1726882865.47034: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882865.47088: variable 'ansible_facts' from source: unknown 30564 1726882865.47286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749/AnsiballZ_systemd.py 30564 1726882865.47429: Sending initial data 30564 1726882865.47433: Sent initial data (156 bytes) 30564 1726882865.48322: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882865.48330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.48340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.48353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.48392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.48398: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882865.48409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.48422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882865.48431: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882865.48434: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882865.48441: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.48451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.48461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.48477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.48484: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882865.48493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.48587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882865.48601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882865.48611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882865.48729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882865.50482: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882865.50587: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882865.50690: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp43cwdqzz 
/root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749/AnsiballZ_systemd.py <<< 30564 1726882865.50792: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882865.53804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882865.53896: stderr chunk (state=3): >>><<< 30564 1726882865.53900: stdout chunk (state=3): >>><<< 30564 1726882865.53922: done transferring module to remote 30564 1726882865.53932: _low_level_execute_command(): starting 30564 1726882865.53937: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749/ /root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749/AnsiballZ_systemd.py && sleep 0' 30564 1726882865.54603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882865.54613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.54623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.54638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.54687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.54694: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882865.54704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.54717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882865.54725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882865.54731: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882865.54738: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882865.54747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.54759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.54773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.54782: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882865.54796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.54859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882865.54879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882865.54899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882865.55033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882865.56926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882865.56930: stdout chunk (state=3): >>><<< 30564 1726882865.56935: stderr chunk (state=3): >>><<< 30564 1726882865.56956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882865.56959: _low_level_execute_command(): starting 30564 1726882865.56965: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749/AnsiballZ_systemd.py && sleep 0' 30564 1726882865.57623: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882865.57631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.57641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882865.57653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.57695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.57712: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882865.57720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.57733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882865.57740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882865.57746: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882865.57754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882865.57763: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882865.57779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882865.57786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882865.57793: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882865.57801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.57885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882865.57901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882865.57912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882865.58055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882865.83275: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", 
"ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9158656", "MemoryAvailable": "infinity", "CPUUsageNSec": "2205114000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", 
"MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 30564 1726882865.83293: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw 
cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": 
"running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonoton<<< 30564 1726882865.83302: stdout chunk (state=3): >>>ic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882865.84891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.11.158 closed. <<< 30564 1726882865.84895: stdout chunk (state=3): >>><<< 30564 1726882865.84902: stderr chunk (state=3): >>><<< 30564 1726882865.84920: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9158656", "MemoryAvailable": "infinity", "CPUUsageNSec": "2205114000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882865.85104: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882865.85121: _low_level_execute_command(): starting 30564 1726882865.85126: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882865.4391642-33310-220892780125749/ > /dev/null 2>&1 && sleep 0' 30564 1726882865.86183: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882865.86263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882865.86394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882865.88256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882865.88260: stdout chunk (state=3): >>><<< 30564 1726882865.88262: stderr chunk (state=3): >>><<< 30564 1726882865.88669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 30564 1726882865.88673: handler run complete 30564 1726882865.88675: attempt loop complete, returning result 30564 1726882865.88678: _execute() done 30564 1726882865.88680: dumping result to json 30564 1726882865.88682: done dumping result, returning 30564 1726882865.88684: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-00000000146b] 30564 1726882865.88686: sending task result for task 0e448fcc-3ce9-4216-acec-00000000146b 30564 1726882865.88822: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000146b 30564 1726882865.88825: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882865.88924: no more pending results, returning what we have 30564 1726882865.88926: results queue empty 30564 1726882865.88927: checking for any_errors_fatal 30564 1726882865.88934: done checking for any_errors_fatal 30564 1726882865.88934: checking for max_fail_percentage 30564 1726882865.88936: done checking for max_fail_percentage 30564 1726882865.88937: checking to see if all hosts have failed and the running result is not ok 30564 1726882865.88938: done checking to see if all hosts have failed 30564 1726882865.88938: getting the remaining hosts for this loop 30564 1726882865.88940: done getting the remaining hosts for this loop 30564 1726882865.88943: getting the next task for host managed_node2 30564 1726882865.88950: done getting next task for host managed_node2 30564 1726882865.88953: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882865.88958: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882865.88973: getting variables 30564 1726882865.88975: in VariableManager get_vars() 30564 1726882865.89004: Calling all_inventory to load vars for managed_node2 30564 1726882865.89007: Calling groups_inventory to load vars for managed_node2 30564 1726882865.89009: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882865.89018: Calling all_plugins_play to load vars for managed_node2 30564 1726882865.89020: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882865.89023: Calling groups_plugins_play to load vars for managed_node2 30564 1726882865.90831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882865.92566: done with get_vars() 30564 1726882865.92595: done getting variables 30564 1726882865.92653: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:05 -0400 (0:00:00.682) 0:01:04.508 ****** 30564 1726882865.92692: entering _queue_task() for managed_node2/service 30564 1726882865.93065: worker is 1 (out of 1 available) 30564 1726882865.93081: exiting _queue_task() for managed_node2/service 30564 1726882865.93099: done queuing things up, now waiting for results queue to drain 30564 1726882865.93100: waiting for pending results... 30564 1726882865.93420: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882865.93583: in run() - task 0e448fcc-3ce9-4216-acec-00000000146c 30564 1726882865.93595: variable 'ansible_search_path' from source: unknown 30564 1726882865.93599: variable 'ansible_search_path' from source: unknown 30564 1726882865.93640: calling self._execute() 30564 1726882865.93748: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882865.93758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882865.93770: variable 'omit' from source: magic vars 30564 1726882865.94174: variable 'ansible_distribution_major_version' from source: facts 30564 1726882865.94196: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882865.94329: variable 'network_provider' from source: set_fact 30564 1726882865.94334: Evaluated conditional (network_provider == "nm"): True 30564 1726882865.94435: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882865.94533: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 30564 1726882865.94710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882865.97107: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882865.97180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882865.97223: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882865.97267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882865.97294: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882865.97395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.97424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.97454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.97508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.97523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.97578: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.97602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.97626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.97665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882865.97689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.97728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882865.97750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882865.97781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.97822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 
1726882865.97835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882865.97978: variable 'network_connections' from source: include params 30564 1726882865.97995: variable 'interface' from source: play vars 30564 1726882865.98063: variable 'interface' from source: play vars 30564 1726882865.98136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882865.98295: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882865.98336: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882865.98373: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882865.98409: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882865.98454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882865.98484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882865.98516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882865.98545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882865.98598: variable 
'__network_wireless_connections_defined' from source: role '' defaults 30564 1726882865.99033: variable 'network_connections' from source: include params 30564 1726882865.99044: variable 'interface' from source: play vars 30564 1726882865.99113: variable 'interface' from source: play vars 30564 1726882865.99146: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882865.99558: when evaluation is False, skipping this task 30564 1726882865.99568: _execute() done 30564 1726882865.99575: dumping result to json 30564 1726882865.99582: done dumping result, returning 30564 1726882865.99603: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-00000000146c] 30564 1726882865.99628: sending task result for task 0e448fcc-3ce9-4216-acec-00000000146c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882865.99788: no more pending results, returning what we have 30564 1726882865.99791: results queue empty 30564 1726882865.99792: checking for any_errors_fatal 30564 1726882865.99818: done checking for any_errors_fatal 30564 1726882865.99819: checking for max_fail_percentage 30564 1726882865.99820: done checking for max_fail_percentage 30564 1726882865.99821: checking to see if all hosts have failed and the running result is not ok 30564 1726882865.99822: done checking to see if all hosts have failed 30564 1726882865.99823: getting the remaining hosts for this loop 30564 1726882865.99824: done getting the remaining hosts for this loop 30564 1726882865.99828: getting the next task for host managed_node2 30564 1726882865.99837: done getting next task for host managed_node2 30564 1726882865.99841: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882865.99845: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882865.99875: getting variables 30564 1726882865.99877: in VariableManager get_vars() 30564 1726882865.99914: Calling all_inventory to load vars for managed_node2 30564 1726882865.99916: Calling groups_inventory to load vars for managed_node2 30564 1726882865.99919: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882865.99931: Calling all_plugins_play to load vars for managed_node2 30564 1726882865.99934: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882865.99937: Calling groups_plugins_play to load vars for managed_node2 30564 1726882866.00457: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000146c 30564 1726882866.00460: WORKER PROCESS EXITING 30564 1726882866.01135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882866.02409: done with get_vars() 30564 1726882866.02433: done getting variables 30564 1726882866.02502: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:06 -0400 (0:00:00.098) 0:01:04.606 ****** 30564 1726882866.02534: entering _queue_task() for managed_node2/service 30564 1726882866.02838: worker is 1 (out of 1 available) 30564 1726882866.02851: exiting _queue_task() for managed_node2/service 30564 1726882866.02865: done queuing things up, now waiting for results queue to drain 30564 1726882866.02866: waiting for pending results... 
30564 1726882866.03150: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882866.03300: in run() - task 0e448fcc-3ce9-4216-acec-00000000146d 30564 1726882866.03322: variable 'ansible_search_path' from source: unknown 30564 1726882866.03329: variable 'ansible_search_path' from source: unknown 30564 1726882866.03365: calling self._execute() 30564 1726882866.03456: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882866.03461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882866.03471: variable 'omit' from source: magic vars 30564 1726882866.03766: variable 'ansible_distribution_major_version' from source: facts 30564 1726882866.03785: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882866.03902: variable 'network_provider' from source: set_fact 30564 1726882866.03912: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882866.03918: when evaluation is False, skipping this task 30564 1726882866.03924: _execute() done 30564 1726882866.03929: dumping result to json 30564 1726882866.03935: done dumping result, returning 30564 1726882866.03944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-00000000146d] 30564 1726882866.03952: sending task result for task 0e448fcc-3ce9-4216-acec-00000000146d 30564 1726882866.04063: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000146d skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882866.04113: no more pending results, returning what we have 30564 1726882866.04117: results queue empty 30564 1726882866.04118: checking for any_errors_fatal 30564 1726882866.04127: done checking for any_errors_fatal 30564 1726882866.04127: 
checking for max_fail_percentage 30564 1726882866.04129: done checking for max_fail_percentage 30564 1726882866.04130: checking to see if all hosts have failed and the running result is not ok 30564 1726882866.04131: done checking to see if all hosts have failed 30564 1726882866.04131: getting the remaining hosts for this loop 30564 1726882866.04133: done getting the remaining hosts for this loop 30564 1726882866.04137: getting the next task for host managed_node2 30564 1726882866.04145: done getting next task for host managed_node2 30564 1726882866.04149: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882866.04155: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882866.04189: getting variables 30564 1726882866.04191: in VariableManager get_vars() 30564 1726882866.04228: Calling all_inventory to load vars for managed_node2 30564 1726882866.04231: Calling groups_inventory to load vars for managed_node2 30564 1726882866.04234: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882866.04248: Calling all_plugins_play to load vars for managed_node2 30564 1726882866.04250: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882866.04254: Calling groups_plugins_play to load vars for managed_node2 30564 1726882866.04779: WORKER PROCESS EXITING 30564 1726882866.05400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882866.06340: done with get_vars() 30564 1726882866.06356: done getting variables 30564 1726882866.06401: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:06 -0400 (0:00:00.038) 0:01:04.645 ****** 30564 1726882866.06425: entering _queue_task() for managed_node2/copy 30564 1726882866.06652: worker is 1 (out of 1 available) 30564 1726882866.06668: exiting _queue_task() for managed_node2/copy 30564 1726882866.06681: done queuing things up, now waiting for results queue to drain 30564 1726882866.06683: waiting for pending results... 
30564 1726882866.06870: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882866.06977: in run() - task 0e448fcc-3ce9-4216-acec-00000000146e 30564 1726882866.06989: variable 'ansible_search_path' from source: unknown 30564 1726882866.06993: variable 'ansible_search_path' from source: unknown 30564 1726882866.07024: calling self._execute() 30564 1726882866.07103: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882866.07107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882866.07117: variable 'omit' from source: magic vars 30564 1726882866.07403: variable 'ansible_distribution_major_version' from source: facts 30564 1726882866.07414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882866.07499: variable 'network_provider' from source: set_fact 30564 1726882866.07503: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882866.07506: when evaluation is False, skipping this task 30564 1726882866.07509: _execute() done 30564 1726882866.07511: dumping result to json 30564 1726882866.07514: done dumping result, returning 30564 1726882866.07523: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-00000000146e] 30564 1726882866.07528: sending task result for task 0e448fcc-3ce9-4216-acec-00000000146e 30564 1726882866.07619: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000146e 30564 1726882866.07622: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882866.07673: no more pending results, returning what we have 30564 1726882866.07677: results queue empty 30564 1726882866.07678: checking for 
any_errors_fatal 30564 1726882866.07687: done checking for any_errors_fatal 30564 1726882866.07687: checking for max_fail_percentage 30564 1726882866.07689: done checking for max_fail_percentage 30564 1726882866.07690: checking to see if all hosts have failed and the running result is not ok 30564 1726882866.07691: done checking to see if all hosts have failed 30564 1726882866.07691: getting the remaining hosts for this loop 30564 1726882866.07693: done getting the remaining hosts for this loop 30564 1726882866.07697: getting the next task for host managed_node2 30564 1726882866.07704: done getting next task for host managed_node2 30564 1726882866.07708: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882866.07714: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882866.07738: getting variables 30564 1726882866.07740: in VariableManager get_vars() 30564 1726882866.07773: Calling all_inventory to load vars for managed_node2 30564 1726882866.07775: Calling groups_inventory to load vars for managed_node2 30564 1726882866.07778: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882866.07786: Calling all_plugins_play to load vars for managed_node2 30564 1726882866.07789: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882866.07791: Calling groups_plugins_play to load vars for managed_node2 30564 1726882866.08590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882866.09996: done with get_vars() 30564 1726882866.10016: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:06 -0400 (0:00:00.036) 0:01:04.682 ****** 30564 1726882866.10089: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882866.10332: worker is 1 (out of 1 available) 30564 1726882866.10344: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882866.10355: done queuing things up, now waiting for results queue to drain 30564 1726882866.10356: waiting for pending results... 
30564 1726882866.10626: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882866.10760: in run() - task 0e448fcc-3ce9-4216-acec-00000000146f 30564 1726882866.10781: variable 'ansible_search_path' from source: unknown 30564 1726882866.10787: variable 'ansible_search_path' from source: unknown 30564 1726882866.10825: calling self._execute() 30564 1726882866.10923: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882866.10935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882866.10949: variable 'omit' from source: magic vars 30564 1726882866.11299: variable 'ansible_distribution_major_version' from source: facts 30564 1726882866.11315: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882866.11324: variable 'omit' from source: magic vars 30564 1726882866.11386: variable 'omit' from source: magic vars 30564 1726882866.11532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882866.13621: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882866.13666: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882866.13695: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882866.13720: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882866.13741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882866.13798: variable 'network_provider' from source: set_fact 30564 1726882866.13888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882866.13906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882866.13925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882866.13952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882866.13964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882866.14016: variable 'omit' from source: magic vars 30564 1726882866.14091: variable 'omit' from source: magic vars 30564 1726882866.14157: variable 'network_connections' from source: include params 30564 1726882866.14170: variable 'interface' from source: play vars 30564 1726882866.14212: variable 'interface' from source: play vars 30564 1726882866.14316: variable 'omit' from source: magic vars 30564 1726882866.14324: variable '__lsr_ansible_managed' from source: task vars 30564 1726882866.14369: variable '__lsr_ansible_managed' from source: task vars 30564 1726882866.14501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882866.14639: Loaded config def from plugin (lookup/template) 30564 1726882866.14643: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882866.14662: File lookup term: get_ansible_managed.j2 30564 1726882866.14670: variable 
'ansible_search_path' from source: unknown 30564 1726882866.14674: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882866.14687: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882866.14698: variable 'ansible_search_path' from source: unknown 30564 1726882866.18241: variable 'ansible_managed' from source: unknown 30564 1726882866.18322: variable 'omit' from source: magic vars 30564 1726882866.18339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882866.18358: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882866.18371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882866.18388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882866.18396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882866.18417: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882866.18420: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882866.18423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882866.18484: Set connection var ansible_timeout to 10 30564 1726882866.18487: Set connection var ansible_pipelining to False 30564 1726882866.18490: Set connection var ansible_shell_type to sh 30564 1726882866.18495: Set connection var ansible_shell_executable to /bin/sh 30564 1726882866.18502: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882866.18504: Set connection var ansible_connection to ssh 30564 1726882866.18523: variable 'ansible_shell_executable' from source: unknown 30564 1726882866.18526: variable 'ansible_connection' from source: unknown 30564 1726882866.18528: variable 'ansible_module_compression' from source: unknown 30564 1726882866.18531: variable 'ansible_shell_type' from source: unknown 30564 1726882866.18533: variable 'ansible_shell_executable' from source: unknown 30564 1726882866.18535: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882866.18537: variable 'ansible_pipelining' from source: unknown 30564 1726882866.18540: variable 'ansible_timeout' from source: unknown 30564 1726882866.18544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882866.18629: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882866.18641: variable 'omit' from 
source: magic vars 30564 1726882866.18644: starting attempt loop 30564 1726882866.18646: running the handler 30564 1726882866.18656: _low_level_execute_command(): starting 30564 1726882866.18661: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882866.19151: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882866.19175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882866.19188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882866.19200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.19244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882866.19256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882866.19375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882866.21035: stdout chunk (state=3): >>>/root <<< 30564 1726882866.21138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882866.21189: stderr chunk (state=3): >>><<< 30564 1726882866.21192: stdout chunk 
(state=3): >>><<< 30564 1726882866.21209: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882866.21218: _low_level_execute_command(): starting 30564 1726882866.21224: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458 `" && echo ansible-tmp-1726882866.2120895-33369-280212683855458="` echo /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458 `" ) && sleep 0' 30564 1726882866.21641: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882866.21653: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882866.21665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882866.21678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882866.21688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.21732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882866.21751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882866.21850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882866.23728: stdout chunk (state=3): >>>ansible-tmp-1726882866.2120895-33369-280212683855458=/root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458 <<< 30564 1726882866.23848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882866.23892: stderr chunk (state=3): >>><<< 30564 1726882866.23896: stdout chunk (state=3): >>><<< 30564 1726882866.23908: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882866.2120895-33369-280212683855458=/root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882866.23938: variable 'ansible_module_compression' from source: unknown 30564 1726882866.23974: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882866.24013: variable 'ansible_facts' from source: unknown 30564 1726882866.24104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458/AnsiballZ_network_connections.py 30564 1726882866.24199: Sending initial data 30564 1726882866.24209: Sent initial data (168 bytes) 30564 1726882866.24837: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882866.24841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882866.24878: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.24885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882866.24888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.24937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882866.24940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882866.25040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882866.26776: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882866.26875: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882866.26967: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp6tr4u2aa /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458/AnsiballZ_network_connections.py <<< 30564 1726882866.27062: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882866.28420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882866.28518: stderr chunk (state=3): >>><<< 30564 1726882866.28521: stdout chunk (state=3): >>><<< 30564 1726882866.28536: done transferring module to remote 30564 1726882866.28545: _low_level_execute_command(): starting 30564 1726882866.28549: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458/ /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458/AnsiballZ_network_connections.py && sleep 0' 30564 1726882866.29000: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882866.29003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882866.29038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.29042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882866.29046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
found <<< 30564 1726882866.29048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.29100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882866.29104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882866.29113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882866.29225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882866.31030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882866.31073: stderr chunk (state=3): >>><<< 30564 1726882866.31077: stdout chunk (state=3): >>><<< 30564 1726882866.31089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882866.31093: 
_low_level_execute_command(): starting 30564 1726882866.31097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458/AnsiballZ_network_connections.py && sleep 0' 30564 1726882866.31507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882866.31518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882866.31543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.31555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.31609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882866.31621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882866.31741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882866.56350: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7crgmlkx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7crgmlkx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/ef91e5fd-4b93-4ee4-ae54-4de7a703b196: error=unknown <<< 30564 1726882866.56543: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882866.58118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882866.58122: stdout chunk (state=3): >>><<< 30564 1726882866.58125: stderr chunk (state=3): >>><<< 30564 1726882866.58285: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7crgmlkx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7crgmlkx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/ef91e5fd-4b93-4ee4-ae54-4de7a703b196: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882866.58289: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882866.58296: _low_level_execute_command(): starting 30564 1726882866.58299: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882866.2120895-33369-280212683855458/ > /dev/null 2>&1 && sleep 0' 30564 1726882866.59055: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882866.59059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882866.59091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882866.59094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882866.59097: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882866.59099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882866.59175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882866.59460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882866.59574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882866.61437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882866.61441: stdout chunk (state=3): >>><<< 30564 1726882866.61448: stderr chunk (state=3): >>><<< 30564 1726882866.61475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882866.61481: handler run complete 30564 1726882866.61520: attempt loop complete, returning result 30564 1726882866.61523: _execute() done 30564 1726882866.61525: dumping result to json 30564 1726882866.61528: done dumping result, returning 30564 1726882866.61539: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-00000000146f] 30564 1726882866.61545: sending task result for task 0e448fcc-3ce9-4216-acec-00000000146f 30564 1726882866.61705: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000146f 30564 1726882866.61708: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30564 1726882866.61867: no 
more pending results, returning what we have 30564 1726882866.61876: results queue empty 30564 1726882866.61878: checking for any_errors_fatal 30564 1726882866.61884: done checking for any_errors_fatal 30564 1726882866.61885: checking for max_fail_percentage 30564 1726882866.61886: done checking for max_fail_percentage 30564 1726882866.61887: checking to see if all hosts have failed and the running result is not ok 30564 1726882866.61888: done checking to see if all hosts have failed 30564 1726882866.61889: getting the remaining hosts for this loop 30564 1726882866.61890: done getting the remaining hosts for this loop 30564 1726882866.61902: getting the next task for host managed_node2 30564 1726882866.61914: done getting next task for host managed_node2 30564 1726882866.61918: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882866.61929: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 30564 1726882866.61941: getting variables 30564 1726882866.61948: in VariableManager get_vars() 30564 1726882866.62019: Calling all_inventory to load vars for managed_node2 30564 1726882866.62022: Calling groups_inventory to load vars for managed_node2 30564 1726882866.62024: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882866.62044: Calling all_plugins_play to load vars for managed_node2 30564 1726882866.62056: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882866.62071: Calling groups_plugins_play to load vars for managed_node2 30564 1726882866.65526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882866.67827: done with get_vars() 30564 1726882866.67850: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:06 -0400 (0:00:00.580) 0:01:05.262 ****** 30564 1726882866.68158: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882866.68939: worker is 1 (out of 1 available) 30564 1726882866.68955: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882866.68971: done queuing things up, now waiting for results queue to drain 30564 1726882866.68972: waiting for pending results... 
30564 1726882866.70169: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state
30564 1726882866.70436: in run() - task 0e448fcc-3ce9-4216-acec-000000001470
30564 1726882866.70485: variable 'ansible_search_path' from source: unknown
30564 1726882866.70494: variable 'ansible_search_path' from source: unknown
30564 1726882866.70595: calling self._execute()
30564 1726882866.70755: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882866.70769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882866.70796: variable 'omit' from source: magic vars
30564 1726882866.71208: variable 'ansible_distribution_major_version' from source: facts
30564 1726882866.71240: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882866.71379: variable 'network_state' from source: role '' defaults
30564 1726882866.71396: Evaluated conditional (network_state != {}): False
30564 1726882866.71404: when evaluation is False, skipping this task
30564 1726882866.71412: _execute() done
30564 1726882866.71419: dumping result to json
30564 1726882866.71426: done dumping result, returning
30564 1726882866.71442: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000001470]
30564 1726882866.71453: sending task result for task 0e448fcc-3ce9-4216-acec-000000001470
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882866.71608: no more pending results, returning what we have
30564 1726882866.71613: results queue empty
30564 1726882866.71614: checking for any_errors_fatal
30564 1726882866.71627: done checking for any_errors_fatal
30564 1726882866.71628: checking for max_fail_percentage
30564 1726882866.71630: done checking for max_fail_percentage
30564 1726882866.71631: checking to see if all hosts have failed and the running result is not ok
30564 1726882866.71632: done checking to see if all hosts have failed
30564 1726882866.71633: getting the remaining hosts for this loop
30564 1726882866.71635: done getting the remaining hosts for this loop
30564 1726882866.71639: getting the next task for host managed_node2
30564 1726882866.71647: done getting next task for host managed_node2
30564 1726882866.71651: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30564 1726882866.71658: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882866.71740: getting variables
30564 1726882866.71743: in VariableManager get_vars()
30564 1726882866.71799: Calling all_inventory to load vars for managed_node2
30564 1726882866.71802: Calling groups_inventory to load vars for managed_node2
30564 1726882866.71805: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882866.71818: Calling all_plugins_play to load vars for managed_node2
30564 1726882866.71822: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882866.71825: Calling groups_plugins_play to load vars for managed_node2
30564 1726882866.73396: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001470
30564 1726882866.73400: WORKER PROCESS EXITING
30564 1726882866.74211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882866.76264: done with get_vars()
30564 1726882866.76286: done getting variables
30564 1726882866.76341: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 21:41:06 -0400 (0:00:00.082) 0:01:05.345 ******
30564 1726882866.76376: entering _queue_task() for managed_node2/debug
30564 1726882866.76647: worker is 1 (out of 1 available)
30564 1726882866.76660: exiting _queue_task() for managed_node2/debug
30564 1726882866.76676: done queuing things up, now waiting for results queue to drain
30564 1726882866.76678: waiting for pending results...
30564 1726882866.76967: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30564 1726882866.77107: in run() - task 0e448fcc-3ce9-4216-acec-000000001471
30564 1726882866.77132: variable 'ansible_search_path' from source: unknown
30564 1726882866.77151: variable 'ansible_search_path' from source: unknown
30564 1726882866.77314: calling self._execute()
30564 1726882866.77690: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882866.77712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882866.77728: variable 'omit' from source: magic vars
30564 1726882866.78612: variable 'ansible_distribution_major_version' from source: facts
30564 1726882866.78633: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882866.78645: variable 'omit' from source: magic vars
30564 1726882866.78769: variable 'omit' from source: magic vars
30564 1726882866.78832: variable 'omit' from source: magic vars
30564 1726882866.78879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882866.78922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882866.78970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882866.78994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882866.79011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882866.79051: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882866.79060: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882866.79072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882866.79185: Set connection var ansible_timeout to 10
30564 1726882866.79199: Set connection var ansible_pipelining to False
30564 1726882866.79217: Set connection var ansible_shell_type to sh
30564 1726882866.79256: Set connection var ansible_shell_executable to /bin/sh
30564 1726882866.79297: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882866.79345: Set connection var ansible_connection to ssh
30564 1726882866.79442: variable 'ansible_shell_executable' from source: unknown
30564 1726882866.79450: variable 'ansible_connection' from source: unknown
30564 1726882866.79458: variable 'ansible_module_compression' from source: unknown
30564 1726882866.79474: variable 'ansible_shell_type' from source: unknown
30564 1726882866.79493: variable 'ansible_shell_executable' from source: unknown
30564 1726882866.79501: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882866.79508: variable 'ansible_pipelining' from source: unknown
30564 1726882866.79515: variable 'ansible_timeout' from source: unknown
30564 1726882866.79523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882866.79716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882866.79733: variable 'omit' from source: magic vars
30564 1726882866.79743: starting attempt loop
30564 1726882866.79750: running the handler
30564 1726882866.79933: variable '__network_connections_result' from source: set_fact
30564 1726882866.80012: handler run complete
30564 1726882866.80041: attempt loop complete, returning result
30564 1726882866.80049: _execute() done
30564 1726882866.80057: dumping result to json
30564 1726882866.80091: done dumping result, returning
30564 1726882866.80105: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001471]
30564 1726882866.80115: sending task result for task 0e448fcc-3ce9-4216-acec-000000001471
30564 1726882866.80235: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001471
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
30564 1726882866.80329: no more pending results, returning what we have
30564 1726882866.80333: results queue empty
30564 1726882866.80334: checking for any_errors_fatal
30564 1726882866.80358: done checking for any_errors_fatal
30564 1726882866.80360: checking for max_fail_percentage
30564 1726882866.80361: done checking for max_fail_percentage
30564 1726882866.80362: checking to see if all hosts have failed and the running result is not ok
30564 1726882866.80365: done checking to see if all hosts have failed
30564 1726882866.80366: getting the remaining hosts for this loop
30564 1726882866.80368: done getting the remaining hosts for this loop
30564 1726882866.80372: getting the next task for host managed_node2
30564 1726882866.80380: done getting next task for host managed_node2
30564 1726882866.80396: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30564 1726882866.80403: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882866.80417: getting variables
30564 1726882866.80419: in VariableManager get_vars()
30564 1726882866.80457: Calling all_inventory to load vars for managed_node2
30564 1726882866.80460: Calling groups_inventory to load vars for managed_node2
30564 1726882866.80465: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882866.80476: Calling all_plugins_play to load vars for managed_node2
30564 1726882866.80480: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882866.80484: Calling groups_plugins_play to load vars for managed_node2
30564 1726882866.81674: WORKER PROCESS EXITING
30564 1726882866.82627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882866.85983: done with get_vars()
30564 1726882866.86049: done getting variables
30564 1726882866.86192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 21:41:06 -0400 (0:00:00.098) 0:01:05.444 ******
30564 1726882866.86300: entering _queue_task() for managed_node2/debug
30564 1726882866.87022: worker is 1 (out of 1 available)
30564 1726882866.87059: exiting _queue_task() for managed_node2/debug
30564 1726882866.87111: done queuing things up, now waiting for results queue to drain
30564 1726882866.87119: waiting for pending results...
30564 1726882866.87650: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30564 1726882866.87955: in run() - task 0e448fcc-3ce9-4216-acec-000000001472
30564 1726882866.88010: variable 'ansible_search_path' from source: unknown
30564 1726882866.88037: variable 'ansible_search_path' from source: unknown
30564 1726882866.88080: calling self._execute()
30564 1726882866.88344: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882866.88390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882866.88426: variable 'omit' from source: magic vars
30564 1726882866.89099: variable 'ansible_distribution_major_version' from source: facts
30564 1726882866.89118: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882866.89129: variable 'omit' from source: magic vars
30564 1726882866.89198: variable 'omit' from source: magic vars
30564 1726882866.89237: variable 'omit' from source: magic vars
30564 1726882866.89547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882866.89597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882866.89832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882866.90045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882866.90141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882866.90291: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882866.90332: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882866.90410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882866.91157: Set connection var ansible_timeout to 10
30564 1726882866.91239: Set connection var ansible_pipelining to False
30564 1726882866.91266: Set connection var ansible_shell_type to sh
30564 1726882866.91286: Set connection var ansible_shell_executable to /bin/sh
30564 1726882866.91317: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882866.91341: Set connection var ansible_connection to ssh
30564 1726882866.91574: variable 'ansible_shell_executable' from source: unknown
30564 1726882866.91653: variable 'ansible_connection' from source: unknown
30564 1726882866.91686: variable 'ansible_module_compression' from source: unknown
30564 1726882866.91727: variable 'ansible_shell_type' from source: unknown
30564 1726882866.91738: variable 'ansible_shell_executable' from source: unknown
30564 1726882866.91780: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882866.91856: variable 'ansible_pipelining' from source: unknown
30564 1726882866.91881: variable 'ansible_timeout' from source: unknown
30564 1726882866.91918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882866.92691: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882866.92753: variable 'omit' from source: magic vars
30564 1726882866.92837: starting attempt loop
30564 1726882866.92875: running the handler
30564 1726882866.93179: variable '__network_connections_result' from source: set_fact
30564 1726882866.93541: variable '__network_connections_result' from source: set_fact
30564 1726882866.93821: handler run complete
30564 1726882866.93999: attempt loop complete, returning result
30564 1726882866.94014: _execute() done
30564 1726882866.94031: dumping result to json
30564 1726882866.94078: done dumping result, returning
30564 1726882866.94123: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001472]
30564 1726882866.94154: sending task result for task 0e448fcc-3ce9-4216-acec-000000001472
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "statebr",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
30564 1726882866.94484: no more pending results, returning what we have
30564 1726882866.94488: results queue empty
30564 1726882866.94489: checking for any_errors_fatal
30564 1726882866.94534: done checking for any_errors_fatal
30564 1726882866.94535: checking for max_fail_percentage
30564 1726882866.94538: done checking for max_fail_percentage
30564 1726882866.94539: checking to see if all hosts have failed and the running result is not ok
30564 1726882866.94540: done checking to see if all hosts have failed
30564 1726882866.94541: getting the remaining hosts for this loop
30564 1726882866.94559: done getting the remaining hosts for this loop
30564 1726882866.94565: getting the next task for host managed_node2
30564 1726882866.94599: done getting next task for host managed_node2
30564 1726882866.94616: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30564 1726882866.94623: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882866.94636: getting variables
30564 1726882866.94638: in VariableManager get_vars()
30564 1726882866.94687: Calling all_inventory to load vars for managed_node2
30564 1726882866.94690: Calling groups_inventory to load vars for managed_node2
30564 1726882866.94693: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882866.94706: Calling all_plugins_play to load vars for managed_node2
30564 1726882866.94710: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882866.94723: Calling groups_plugins_play to load vars for managed_node2
30564 1726882866.95768: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001472
30564 1726882866.95778: WORKER PROCESS EXITING
30564 1726882866.97609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882867.01798: done with get_vars()
30564 1726882867.01837: done getting variables
30564 1726882867.01975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:41:07 -0400 (0:00:00.157) 0:01:05.602 ******
30564 1726882867.02095: entering _queue_task() for managed_node2/debug
30564 1726882867.02756: worker is 1 (out of 1 available)
30564 1726882867.02817: exiting _queue_task() for managed_node2/debug
30564 1726882867.02880: done queuing things up, now waiting for results queue to drain
30564 1726882867.02882: waiting for pending results...
30564 1726882867.03361: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30564 1726882867.03605: in run() - task 0e448fcc-3ce9-4216-acec-000000001473
30564 1726882867.03653: variable 'ansible_search_path' from source: unknown
30564 1726882867.03660: variable 'ansible_search_path' from source: unknown
30564 1726882867.03719: calling self._execute()
30564 1726882867.03935: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882867.03946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882867.03960: variable 'omit' from source: magic vars
30564 1726882867.05111: variable 'ansible_distribution_major_version' from source: facts
30564 1726882867.05140: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882867.05523: variable 'network_state' from source: role '' defaults
30564 1726882867.05545: Evaluated conditional (network_state != {}): False
30564 1726882867.05560: when evaluation is False, skipping this task
30564 1726882867.05585: _execute() done
30564 1726882867.05637: dumping result to json
30564 1726882867.05673: done dumping result, returning
30564 1726882867.05714: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000001473]
30564 1726882867.05746: sending task result for task 0e448fcc-3ce9-4216-acec-000000001473
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
30564 1726882867.05943: no more pending results, returning what we have
30564 1726882867.05947: results queue empty
30564 1726882867.05948: checking for any_errors_fatal
30564 1726882867.05959: done checking for any_errors_fatal
30564 1726882867.05960: checking for max_fail_percentage
30564 1726882867.05962: done checking for max_fail_percentage
30564 1726882867.05963: checking to see if all hosts have failed and the running result is not ok
30564 1726882867.05965: done checking to see if all hosts have failed
30564 1726882867.05966: getting the remaining hosts for this loop
30564 1726882867.05968: done getting the remaining hosts for this loop
30564 1726882867.05972: getting the next task for host managed_node2
30564 1726882867.05981: done getting next task for host managed_node2
30564 1726882867.05985: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30564 1726882867.05991: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882867.06018: getting variables
30564 1726882867.06020: in VariableManager get_vars()
30564 1726882867.06060: Calling all_inventory to load vars for managed_node2
30564 1726882867.06065: Calling groups_inventory to load vars for managed_node2
30564 1726882867.06068: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882867.06081: Calling all_plugins_play to load vars for managed_node2
30564 1726882867.06084: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882867.06087: Calling groups_plugins_play to load vars for managed_node2
30564 1726882867.07082: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001473
30564 1726882867.07086: WORKER PROCESS EXITING
30564 1726882867.07994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882867.09719: done with get_vars()
30564 1726882867.09741: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:41:07 -0400 (0:00:00.077) 0:01:05.679 ******
30564 1726882867.09842: entering _queue_task() for managed_node2/ping
30564 1726882867.10138: worker is 1 (out of 1 available)
30564 1726882867.10151: exiting _queue_task() for managed_node2/ping
30564 1726882867.10165: done queuing things up, now waiting for results queue to drain
30564 1726882867.10167: waiting for pending results...
30564 1726882867.10459: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
30564 1726882867.10610: in run() - task 0e448fcc-3ce9-4216-acec-000000001474
30564 1726882867.10631: variable 'ansible_search_path' from source: unknown
30564 1726882867.10638: variable 'ansible_search_path' from source: unknown
30564 1726882867.10681: calling self._execute()
30564 1726882867.10789: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882867.10800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882867.10814: variable 'omit' from source: magic vars
30564 1726882867.11196: variable 'ansible_distribution_major_version' from source: facts
30564 1726882867.11216: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882867.11228: variable 'omit' from source: magic vars
30564 1726882867.11300: variable 'omit' from source: magic vars
30564 1726882867.11334: variable 'omit' from source: magic vars
30564 1726882867.11383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882867.11423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882867.11445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882867.11470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882867.11491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882867.11525: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882867.11533: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882867.11540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882867.11648: Set connection var ansible_timeout to 10
30564 1726882867.11659: Set connection var ansible_pipelining to False
30564 1726882867.11668: Set connection var ansible_shell_type to sh
30564 1726882867.11678: Set connection var ansible_shell_executable to /bin/sh
30564 1726882867.11689: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882867.11700: Set connection var ansible_connection to ssh
30564 1726882867.11727: variable 'ansible_shell_executable' from source: unknown
30564 1726882867.11735: variable 'ansible_connection' from source: unknown
30564 1726882867.11742: variable 'ansible_module_compression' from source: unknown
30564 1726882867.11748: variable 'ansible_shell_type' from source: unknown
30564 1726882867.11754: variable 'ansible_shell_executable' from source: unknown
30564 1726882867.11759: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882867.11769: variable 'ansible_pipelining' from source: unknown
30564 1726882867.11775: variable 'ansible_timeout' from source: unknown
30564 1726882867.11782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882867.11988: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
30564 1726882867.12004: variable 'omit' from source: magic vars
30564 1726882867.12014: starting attempt loop
30564 1726882867.12025: running the handler
30564 1726882867.12043: _low_level_execute_command(): starting
30564 1726882867.12054: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30564 1726882867.12819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30564 1726882867.12835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882867.12849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882867.12872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882867.12915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882867.12930: stderr chunk (state=3): >>>debug2: match not found <<<
30564 1726882867.12943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882867.12961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30564 1726882867.12975: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
30564 1726882867.12986: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30564 1726882867.12998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882867.13010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882867.13025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882867.13041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882867.13052: stderr chunk (state=3): >>>debug2: match found <<<
30564 1726882867.13067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882867.13146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882867.13175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882867.13191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882867.13326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882867.15016: stdout chunk (state=3): >>>/root <<<
30564 1726882867.15114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882867.15189: stderr chunk (state=3): >>><<<
30564 1726882867.15200: stdout chunk (state=3): >>><<<
30564 1726882867.15270: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.158 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30564 1726882867.15275: _low_level_execute_command(): starting
30564 1726882867.15279: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524 `" && echo ansible-tmp-1726882867.1523194-33415-268949712161524="` echo /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524 `" ) && sleep 0'
30564 1726882867.15909: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30564 1726882867.15921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882867.15935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882867.15951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882867.15999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882867.16010: stderr chunk (state=3): >>>debug2: match not found <<<
30564 1726882867.16023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882867.16040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30564 1726882867.16050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
30564 1726882867.16060: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30564 1726882867.16074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882867.16087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882867.16102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882867.16118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882867.16128: stderr chunk (state=3): >>>debug2: match found <<<
30564 1726882867.16139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882867.16221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882867.16243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882867.16262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882867.16395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882867.18346: stdout chunk (state=3): >>>ansible-tmp-1726882867.1523194-33415-268949712161524=/root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524 <<<
30564 1726882867.18456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882867.18532: stderr chunk (state=3): >>><<<
30564 1726882867.18543: stdout chunk (state=3): >>><<<
30564 1726882867.18773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882867.1523194-33415-268949712161524=/root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.158 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30564 1726882867.18777: variable 'ansible_module_compression' from source: unknown
30564 1726882867.18780: ANSIBALLZ: using cached
module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882867.18782: variable 'ansible_facts' from source: unknown 30564 1726882867.18784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524/AnsiballZ_ping.py 30564 1726882867.18914: Sending initial data 30564 1726882867.18917: Sent initial data (153 bytes) 30564 1726882867.19895: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882867.19909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.19923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.19941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.19990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.20003: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882867.20017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.20035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882867.20047: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882867.20059: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882867.20076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.20095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.20113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.20126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
30564 1726882867.20138: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882867.20152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.20234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882867.20256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.20275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.20404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.22161: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882867.22255: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882867.22356: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpzd_urgyk /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524/AnsiballZ_ping.py <<< 30564 1726882867.22448: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882867.23717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882867.23956: stderr chunk (state=3): >>><<< 30564 1726882867.23960: stdout chunk (state=3): >>><<< 30564 1726882867.23962: done transferring 
module to remote 30564 1726882867.23975: _low_level_execute_command(): starting 30564 1726882867.23978: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524/ /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524/AnsiballZ_ping.py && sleep 0' 30564 1726882867.24553: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882867.24570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.24586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.24604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.24654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.24683: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882867.24702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.24720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882867.24740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882867.24756: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882867.24772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.24788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.24805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.24819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.24832: stderr chunk (state=3): >>>debug2: match found <<< 
30564 1726882867.24853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.24930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882867.24951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.24975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.25104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.26945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882867.26948: stdout chunk (state=3): >>><<< 30564 1726882867.26951: stderr chunk (state=3): >>><<< 30564 1726882867.27031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882867.27035: 
_low_level_execute_command(): starting 30564 1726882867.27038: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524/AnsiballZ_ping.py && sleep 0' 30564 1726882867.27545: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882867.27559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.27578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.27597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.27634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.27642: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882867.27651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.27668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882867.27678: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882867.27686: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882867.27693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.27702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.27720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.27729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.27737: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882867.27750: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.27829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882867.27842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.27848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.27984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.40933: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882867.41952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882867.42021: stderr chunk (state=3): >>><<< 30564 1726882867.42032: stdout chunk (state=3): >>><<< 30564 1726882867.42171: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882867.42180: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882867.42183: _low_level_execute_command(): starting 30564 1726882867.42185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882867.1523194-33415-268949712161524/ > /dev/null 2>&1 && sleep 0' 30564 1726882867.42803: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882867.42816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.42835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.42856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.42899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.42911: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882867.42923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.42946: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 30564 1726882867.42961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882867.42974: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882867.42985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.42997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.43011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.43022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.43031: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882867.43047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.43129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882867.43154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.43174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.43308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.45160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882867.45232: stderr chunk (state=3): >>><<< 30564 1726882867.45242: stdout chunk (state=3): >>><<< 30564 1726882867.45475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882867.45479: handler run complete 30564 1726882867.45481: attempt loop complete, returning result 30564 1726882867.45483: _execute() done 30564 1726882867.45485: dumping result to json 30564 1726882867.45487: done dumping result, returning 30564 1726882867.45490: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-000000001474] 30564 1726882867.45492: sending task result for task 0e448fcc-3ce9-4216-acec-000000001474 30564 1726882867.45560: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001474 30564 1726882867.45563: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882867.45643: no more pending results, returning what we have 30564 1726882867.45647: results queue empty 30564 1726882867.45648: checking for any_errors_fatal 30564 1726882867.45655: done checking for any_errors_fatal 30564 1726882867.45656: checking for max_fail_percentage 30564 1726882867.45657: done checking for max_fail_percentage 30564 1726882867.45659: checking to see if all hosts have failed and the running result is not ok 30564 1726882867.45659: done checking to see if all hosts 
have failed 30564 1726882867.45660: getting the remaining hosts for this loop 30564 1726882867.45662: done getting the remaining hosts for this loop 30564 1726882867.45674: getting the next task for host managed_node2 30564 1726882867.45685: done getting next task for host managed_node2 30564 1726882867.45687: ^ task is: TASK: meta (role_complete) 30564 1726882867.45694: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882867.45706: getting variables 30564 1726882867.45708: in VariableManager get_vars() 30564 1726882867.45750: Calling all_inventory to load vars for managed_node2 30564 1726882867.45752: Calling groups_inventory to load vars for managed_node2 30564 1726882867.45755: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882867.45767: Calling all_plugins_play to load vars for managed_node2 30564 1726882867.45770: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882867.45773: Calling groups_plugins_play to load vars for managed_node2 30564 1726882867.47816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882867.49795: done with get_vars() 30564 1726882867.49826: done getting variables 30564 1726882867.49922: done queuing things up, now waiting for results queue to drain 30564 1726882867.49924: results queue empty 30564 1726882867.49925: checking for any_errors_fatal 30564 1726882867.49928: done checking for any_errors_fatal 30564 1726882867.49929: checking for max_fail_percentage 30564 1726882867.49930: done checking for max_fail_percentage 30564 1726882867.49931: checking to see if all hosts have failed and the running result is not ok 30564 1726882867.49932: done checking to see if all hosts have failed 30564 1726882867.49933: getting the remaining hosts for this loop 30564 1726882867.49934: done getting the remaining hosts for this loop 30564 1726882867.49937: getting the next task for host managed_node2 30564 1726882867.49942: done getting next task for host managed_node2 30564 1726882867.49945: ^ task is: TASK: Asserts 30564 1726882867.49947: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882867.49950: getting variables 30564 1726882867.49951: in VariableManager get_vars() 30564 1726882867.49966: Calling all_inventory to load vars for managed_node2 30564 1726882867.49971: Calling groups_inventory to load vars for managed_node2 30564 1726882867.49973: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882867.49979: Calling all_plugins_play to load vars for managed_node2 30564 1726882867.49981: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882867.49984: Calling groups_plugins_play to load vars for managed_node2 30564 1726882867.51151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882867.52916: done with get_vars() 30564 1726882867.52937: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:41:07 -0400 (0:00:00.431) 0:01:06.111 ****** 30564 1726882867.53016: entering _queue_task() for managed_node2/include_tasks 30564 1726882867.53365: worker is 1 (out of 1 available) 30564 1726882867.53381: exiting _queue_task() for managed_node2/include_tasks 30564 1726882867.53393: done queuing things up, now waiting for results queue to drain 30564 1726882867.53395: waiting for pending results... 
30564 1726882867.53703: running TaskExecutor() for managed_node2/TASK: Asserts 30564 1726882867.53803: in run() - task 0e448fcc-3ce9-4216-acec-00000000100a 30564 1726882867.53819: variable 'ansible_search_path' from source: unknown 30564 1726882867.53823: variable 'ansible_search_path' from source: unknown 30564 1726882867.53876: variable 'lsr_assert' from source: include params 30564 1726882867.54092: variable 'lsr_assert' from source: include params 30564 1726882867.54165: variable 'omit' from source: magic vars 30564 1726882867.54309: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882867.54318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882867.54330: variable 'omit' from source: magic vars 30564 1726882867.54562: variable 'ansible_distribution_major_version' from source: facts 30564 1726882867.54577: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882867.54584: variable 'item' from source: unknown 30564 1726882867.54647: variable 'item' from source: unknown 30564 1726882867.54684: variable 'item' from source: unknown 30564 1726882867.54748: variable 'item' from source: unknown 30564 1726882867.54882: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882867.54886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882867.54889: variable 'omit' from source: magic vars 30564 1726882867.55012: variable 'ansible_distribution_major_version' from source: facts 30564 1726882867.55017: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882867.55023: variable 'item' from source: unknown 30564 1726882867.55088: variable 'item' from source: unknown 30564 1726882867.55116: variable 'item' from source: unknown 30564 1726882867.55169: variable 'item' from source: unknown 30564 1726882867.55258: dumping result to json 30564 1726882867.55261: done dumping result, returning 30564 
1726882867.55266: done running TaskExecutor() for managed_node2/TASK: Asserts [0e448fcc-3ce9-4216-acec-00000000100a] 30564 1726882867.55268: sending task result for task 0e448fcc-3ce9-4216-acec-00000000100a 30564 1726882867.55301: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000100a 30564 1726882867.55304: WORKER PROCESS EXITING 30564 1726882867.55387: no more pending results, returning what we have 30564 1726882867.55391: in VariableManager get_vars() 30564 1726882867.55433: Calling all_inventory to load vars for managed_node2 30564 1726882867.55436: Calling groups_inventory to load vars for managed_node2 30564 1726882867.55439: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882867.55449: Calling all_plugins_play to load vars for managed_node2 30564 1726882867.55452: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882867.55456: Calling groups_plugins_play to load vars for managed_node2 30564 1726882867.56389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882867.57645: done with get_vars() 30564 1726882867.57672: variable 'ansible_search_path' from source: unknown 30564 1726882867.57678: variable 'ansible_search_path' from source: unknown 30564 1726882867.57719: variable 'ansible_search_path' from source: unknown 30564 1726882867.57720: variable 'ansible_search_path' from source: unknown 30564 1726882867.57751: we have included files to process 30564 1726882867.57752: generating all_blocks data 30564 1726882867.57755: done generating all_blocks data 30564 1726882867.57760: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882867.57761: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882867.57766: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 30564 1726882867.57887: in VariableManager get_vars() 30564 1726882867.57902: done with get_vars() 30564 1726882867.58000: done processing included file 30564 1726882867.58003: iterating over new_blocks loaded from include file 30564 1726882867.58005: in VariableManager get_vars() 30564 1726882867.58016: done with get_vars() 30564 1726882867.58017: filtering new block on tags 30564 1726882867.58038: done filtering new block on tags 30564 1726882867.58039: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 => (item=tasks/assert_device_present.yml) 30564 1726882867.58042: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882867.58043: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882867.58045: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882867.58141: in VariableManager get_vars() 30564 1726882867.58154: done with get_vars() 30564 1726882867.58214: done processing included file 30564 1726882867.58216: iterating over new_blocks loaded from include file 30564 1726882867.58217: in VariableManager get_vars() 30564 1726882867.58227: done with get_vars() 30564 1726882867.58228: filtering new block on tags 30564 1726882867.58246: done filtering new block on tags 30564 1726882867.58247: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for 
managed_node2 => (item=tasks/assert_profile_absent.yml) 30564 1726882867.58249: extending task lists for all hosts with included blocks 30564 1726882867.58869: done extending task lists 30564 1726882867.58871: done processing included files 30564 1726882867.58872: results queue empty 30564 1726882867.58872: checking for any_errors_fatal 30564 1726882867.58873: done checking for any_errors_fatal 30564 1726882867.58874: checking for max_fail_percentage 30564 1726882867.58875: done checking for max_fail_percentage 30564 1726882867.58875: checking to see if all hosts have failed and the running result is not ok 30564 1726882867.58876: done checking to see if all hosts have failed 30564 1726882867.58876: getting the remaining hosts for this loop 30564 1726882867.58877: done getting the remaining hosts for this loop 30564 1726882867.58879: getting the next task for host managed_node2 30564 1726882867.58882: done getting next task for host managed_node2 30564 1726882867.58883: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882867.58885: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882867.58891: getting variables 30564 1726882867.58892: in VariableManager get_vars() 30564 1726882867.58898: Calling all_inventory to load vars for managed_node2 30564 1726882867.58900: Calling groups_inventory to load vars for managed_node2 30564 1726882867.58901: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882867.58905: Calling all_plugins_play to load vars for managed_node2 30564 1726882867.58906: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882867.58908: Calling groups_plugins_play to load vars for managed_node2 30564 1726882867.59570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882867.61033: done with get_vars() 30564 1726882867.61057: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:41:07 -0400 (0:00:00.081) 0:01:06.192 ****** 30564 1726882867.61128: entering _queue_task() for managed_node2/include_tasks 30564 1726882867.61438: worker is 1 (out of 1 available) 30564 1726882867.61453: exiting _queue_task() for managed_node2/include_tasks 30564 1726882867.61473: done queuing things up, now waiting for results queue to drain 30564 1726882867.61475: waiting for pending results... 
30564 1726882867.61799: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882867.61935: in run() - task 0e448fcc-3ce9-4216-acec-0000000015cf 30564 1726882867.61965: variable 'ansible_search_path' from source: unknown 30564 1726882867.61973: variable 'ansible_search_path' from source: unknown 30564 1726882867.62011: calling self._execute() 30564 1726882867.62116: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882867.62129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882867.62152: variable 'omit' from source: magic vars 30564 1726882867.62539: variable 'ansible_distribution_major_version' from source: facts 30564 1726882867.62558: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882867.62577: _execute() done 30564 1726882867.62587: dumping result to json 30564 1726882867.62594: done dumping result, returning 30564 1726882867.62603: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-0000000015cf] 30564 1726882867.62614: sending task result for task 0e448fcc-3ce9-4216-acec-0000000015cf 30564 1726882867.62744: no more pending results, returning what we have 30564 1726882867.62749: in VariableManager get_vars() 30564 1726882867.62790: Calling all_inventory to load vars for managed_node2 30564 1726882867.62793: Calling groups_inventory to load vars for managed_node2 30564 1726882867.62797: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882867.62811: Calling all_plugins_play to load vars for managed_node2 30564 1726882867.62815: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882867.62818: Calling groups_plugins_play to load vars for managed_node2 30564 1726882867.63914: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000015cf 30564 1726882867.63918: WORKER PROCESS EXITING 30564 
1726882867.64707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882867.66559: done with get_vars() 30564 1726882867.66580: variable 'ansible_search_path' from source: unknown 30564 1726882867.66582: variable 'ansible_search_path' from source: unknown 30564 1726882867.66589: variable 'item' from source: include params 30564 1726882867.66699: variable 'item' from source: include params 30564 1726882867.66732: we have included files to process 30564 1726882867.66733: generating all_blocks data 30564 1726882867.66735: done generating all_blocks data 30564 1726882867.66736: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882867.66737: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882867.66739: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882867.66918: done processing included file 30564 1726882867.66920: iterating over new_blocks loaded from include file 30564 1726882867.66921: in VariableManager get_vars() 30564 1726882867.66937: done with get_vars() 30564 1726882867.66939: filtering new block on tags 30564 1726882867.66966: done filtering new block on tags 30564 1726882867.66968: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882867.66973: extending task lists for all hosts with included blocks 30564 1726882867.67150: done extending task lists 30564 1726882867.67152: done processing included files 30564 1726882867.67153: results queue empty 30564 1726882867.67153: checking for any_errors_fatal 30564 1726882867.67157: done 
checking for any_errors_fatal 30564 1726882867.67158: checking for max_fail_percentage 30564 1726882867.67159: done checking for max_fail_percentage 30564 1726882867.67159: checking to see if all hosts have failed and the running result is not ok 30564 1726882867.67160: done checking to see if all hosts have failed 30564 1726882867.67161: getting the remaining hosts for this loop 30564 1726882867.67162: done getting the remaining hosts for this loop 30564 1726882867.67167: getting the next task for host managed_node2 30564 1726882867.67171: done getting next task for host managed_node2 30564 1726882867.67173: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882867.67177: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882867.67179: getting variables 30564 1726882867.67180: in VariableManager get_vars() 30564 1726882867.67189: Calling all_inventory to load vars for managed_node2 30564 1726882867.67196: Calling groups_inventory to load vars for managed_node2 30564 1726882867.67199: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882867.67208: Calling all_plugins_play to load vars for managed_node2 30564 1726882867.67211: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882867.67214: Calling groups_plugins_play to load vars for managed_node2 30564 1726882867.68591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882867.70396: done with get_vars() 30564 1726882867.70416: done getting variables 30564 1726882867.70537: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:41:07 -0400 (0:00:00.094) 0:01:06.287 ****** 30564 1726882867.70575: entering _queue_task() for managed_node2/stat 30564 1726882867.70861: worker is 1 (out of 1 available) 30564 1726882867.70881: exiting _queue_task() for managed_node2/stat 30564 1726882867.70898: done queuing things up, now waiting for results queue to drain 30564 1726882867.70899: waiting for pending results... 
30564 1726882867.71213: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882867.71354: in run() - task 0e448fcc-3ce9-4216-acec-000000001647 30564 1726882867.71375: variable 'ansible_search_path' from source: unknown 30564 1726882867.71383: variable 'ansible_search_path' from source: unknown 30564 1726882867.71417: calling self._execute() 30564 1726882867.71519: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882867.71529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882867.71543: variable 'omit' from source: magic vars 30564 1726882867.71901: variable 'ansible_distribution_major_version' from source: facts 30564 1726882867.71919: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882867.71929: variable 'omit' from source: magic vars 30564 1726882867.71987: variable 'omit' from source: magic vars 30564 1726882867.72084: variable 'interface' from source: play vars 30564 1726882867.72115: variable 'omit' from source: magic vars 30564 1726882867.72157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882867.72196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882867.72226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882867.72248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882867.72262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882867.72296: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882867.72304: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882867.72310: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882867.72419: Set connection var ansible_timeout to 10 30564 1726882867.72438: Set connection var ansible_pipelining to False 30564 1726882867.72444: Set connection var ansible_shell_type to sh 30564 1726882867.72452: Set connection var ansible_shell_executable to /bin/sh 30564 1726882867.72463: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882867.72471: Set connection var ansible_connection to ssh 30564 1726882867.72499: variable 'ansible_shell_executable' from source: unknown 30564 1726882867.72507: variable 'ansible_connection' from source: unknown 30564 1726882867.72513: variable 'ansible_module_compression' from source: unknown 30564 1726882867.72519: variable 'ansible_shell_type' from source: unknown 30564 1726882867.72524: variable 'ansible_shell_executable' from source: unknown 30564 1726882867.72530: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882867.72545: variable 'ansible_pipelining' from source: unknown 30564 1726882867.72551: variable 'ansible_timeout' from source: unknown 30564 1726882867.72559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882867.72777: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882867.72794: variable 'omit' from source: magic vars 30564 1726882867.72804: starting attempt loop 30564 1726882867.72810: running the handler 30564 1726882867.72830: _low_level_execute_command(): starting 30564 1726882867.72842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882867.73590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.73639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.73642: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882867.73646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.73649: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.73691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.73702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.73819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.75495: stdout chunk (state=3): >>>/root <<< 30564 1726882867.75580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882867.75666: stderr chunk (state=3): >>><<< 30564 1726882867.75670: stdout chunk (state=3): >>><<< 30564 1726882867.75779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882867.75783: _low_level_execute_command(): starting 30564 1726882867.75786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846 `" && echo ansible-tmp-1726882867.7569-33430-8126755935846="` echo /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846 `" ) && sleep 0' 30564 1726882867.76341: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882867.76355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.76374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.76392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.76432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.76468: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882867.76491: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.76495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.76498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.76556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.76567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.76685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.78588: stdout chunk (state=3): >>>ansible-tmp-1726882867.7569-33430-8126755935846=/root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846 <<< 30564 1726882867.78742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882867.78902: stderr chunk (state=3): >>><<< 30564 1726882867.78923: stdout chunk (state=3): >>><<< 30564 1726882867.78939: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882867.7569-33430-8126755935846=/root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882867.79015: variable 'ansible_module_compression' from source: unknown 30564 1726882867.79115: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882867.79168: variable 'ansible_facts' from source: unknown 30564 1726882867.79289: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846/AnsiballZ_stat.py 30564 1726882867.79401: Sending initial data 30564 1726882867.79406: Sent initial data (148 bytes) 30564 1726882867.80038: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882867.80046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.80054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.80065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.80100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30564 1726882867.80110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882867.80153: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882867.80156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.80158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.80190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.80193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.80247: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882867.80252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.80312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882867.80355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.80360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.80506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.82321: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 
1726882867.82419: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882867.82517: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp0xx14nhj /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846/AnsiballZ_stat.py <<< 30564 1726882867.82614: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882867.83897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882867.84034: stderr chunk (state=3): >>><<< 30564 1726882867.84037: stdout chunk (state=3): >>><<< 30564 1726882867.84054: done transferring module to remote 30564 1726882867.84066: _low_level_execute_command(): starting 30564 1726882867.84075: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846/ /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846/AnsiballZ_stat.py && sleep 0' 30564 1726882867.84748: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882867.84752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.84754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.84757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.84759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.85081: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882867.85084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.85086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882867.85091: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882867.85093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882867.85095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.85097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.85099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.85101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.85103: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882867.85104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.85106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882867.85108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.85110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.85129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882867.86910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882867.86914: stderr chunk (state=3): >>><<< 30564 1726882867.86916: stdout chunk (state=3): >>><<< 30564 1726882867.86930: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882867.86933: _low_level_execute_command(): starting 30564 1726882867.86938: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846/AnsiballZ_stat.py && sleep 0' 30564 1726882867.87925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.87930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.87958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.87967: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882867.87979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.87984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882867.87992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882867.87997: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882867.88007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882867.88013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882867.88016: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882867.88026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882867.88078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882867.88084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882867.88094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882867.88207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.01649: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32294, "dev": 21, "nlink": 1, "atime": 1726882853.3144777, "mtime": 1726882853.3144777, "ctime": 1726882853.3144777, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882868.02726: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882868.02845: stderr chunk (state=3): >>><<< 30564 1726882868.02848: stdout chunk (state=3): >>><<< 30564 1726882868.02884: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/statebr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32294, "dev": 21, "nlink": 1, "atime": 1726882853.3144777, "mtime": 1726882853.3144777, "ctime": 1726882853.3144777, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882868.02985: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882868.03015: _low_level_execute_command(): starting 30564 1726882868.03022: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882867.7569-33430-8126755935846/ > /dev/null 2>&1 && sleep 0' 30564 1726882868.03565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.03572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.03627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 
1726882868.03633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.03636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.03638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.03739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882868.03745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.03857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.05669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.05800: stderr chunk (state=3): >>><<< 30564 1726882868.05846: stdout chunk (state=3): >>><<< 30564 1726882868.05884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882868.05890: handler run complete 30564 1726882868.05956: attempt loop complete, returning result 30564 1726882868.05959: _execute() done 30564 1726882868.05961: dumping result to json 30564 1726882868.05965: done dumping result, returning 30564 1726882868.05967: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-000000001647] 30564 1726882868.05969: sending task result for task 0e448fcc-3ce9-4216-acec-000000001647 30564 1726882868.06236: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001647 30564 1726882868.06239: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882853.3144777, "block_size": 4096, "blocks": 0, "ctime": 1726882853.3144777, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 32294, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/statebr", "lnk_target": "../../devices/virtual/net/statebr", "mode": "0777", "mtime": 1726882853.3144777, "nlink": 1, "path": "/sys/class/net/statebr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 30564 1726882868.06347: no more pending 
results, returning what we have 30564 1726882868.06351: results queue empty 30564 1726882868.06351: checking for any_errors_fatal 30564 1726882868.06353: done checking for any_errors_fatal 30564 1726882868.06354: checking for max_fail_percentage 30564 1726882868.06356: done checking for max_fail_percentage 30564 1726882868.06356: checking to see if all hosts have failed and the running result is not ok 30564 1726882868.06357: done checking to see if all hosts have failed 30564 1726882868.06362: getting the remaining hosts for this loop 30564 1726882868.06363: done getting the remaining hosts for this loop 30564 1726882868.06379: getting the next task for host managed_node2 30564 1726882868.06388: done getting next task for host managed_node2 30564 1726882868.06392: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 30564 1726882868.06396: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882868.06401: getting variables 30564 1726882868.06402: in VariableManager get_vars() 30564 1726882868.06435: Calling all_inventory to load vars for managed_node2 30564 1726882868.06438: Calling groups_inventory to load vars for managed_node2 30564 1726882868.06441: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882868.06452: Calling all_plugins_play to load vars for managed_node2 30564 1726882868.06455: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882868.06462: Calling groups_plugins_play to load vars for managed_node2 30564 1726882868.07860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882868.08878: done with get_vars() 30564 1726882868.08915: done getting variables 30564 1726882868.08980: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882868.09140: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'statebr'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:41:08 -0400 (0:00:00.386) 0:01:06.673 ****** 30564 1726882868.09210: entering _queue_task() for managed_node2/assert 30564 1726882868.09602: worker is 1 (out of 1 available) 30564 1726882868.09618: exiting _queue_task() for managed_node2/assert 30564 1726882868.09800: done queuing things up, now waiting for results queue to drain 30564 1726882868.09802: waiting for pending results... 
30564 1726882868.10204: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' 30564 1726882868.10431: in run() - task 0e448fcc-3ce9-4216-acec-0000000015d0 30564 1726882868.10483: variable 'ansible_search_path' from source: unknown 30564 1726882868.10497: variable 'ansible_search_path' from source: unknown 30564 1726882868.10701: calling self._execute() 30564 1726882868.10842: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.10857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.10879: variable 'omit' from source: magic vars 30564 1726882868.11510: variable 'ansible_distribution_major_version' from source: facts 30564 1726882868.11523: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882868.11531: variable 'omit' from source: magic vars 30564 1726882868.11565: variable 'omit' from source: magic vars 30564 1726882868.11641: variable 'interface' from source: play vars 30564 1726882868.11652: variable 'omit' from source: magic vars 30564 1726882868.11690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882868.11716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882868.11732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882868.11749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.11757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.11785: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882868.11788: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.11790: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.11885: Set connection var ansible_timeout to 10 30564 1726882868.11889: Set connection var ansible_pipelining to False 30564 1726882868.11891: Set connection var ansible_shell_type to sh 30564 1726882868.11897: Set connection var ansible_shell_executable to /bin/sh 30564 1726882868.11903: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882868.11906: Set connection var ansible_connection to ssh 30564 1726882868.11923: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.11926: variable 'ansible_connection' from source: unknown 30564 1726882868.11929: variable 'ansible_module_compression' from source: unknown 30564 1726882868.11933: variable 'ansible_shell_type' from source: unknown 30564 1726882868.11938: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.11940: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.11944: variable 'ansible_pipelining' from source: unknown 30564 1726882868.11946: variable 'ansible_timeout' from source: unknown 30564 1726882868.11952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.12054: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882868.12064: variable 'omit' from source: magic vars 30564 1726882868.12070: starting attempt loop 30564 1726882868.12074: running the handler 30564 1726882868.12163: variable 'interface_stat' from source: set_fact 30564 1726882868.12183: Evaluated conditional (interface_stat.stat.exists): True 30564 1726882868.12188: handler run complete 30564 1726882868.12200: attempt loop complete, returning result 30564 
1726882868.12202: _execute() done 30564 1726882868.12205: dumping result to json 30564 1726882868.12207: done dumping result, returning 30564 1726882868.12213: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'statebr' [0e448fcc-3ce9-4216-acec-0000000015d0] 30564 1726882868.12219: sending task result for task 0e448fcc-3ce9-4216-acec-0000000015d0 30564 1726882868.12324: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000015d0 30564 1726882868.12326: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882868.12400: no more pending results, returning what we have 30564 1726882868.12404: results queue empty 30564 1726882868.12405: checking for any_errors_fatal 30564 1726882868.12418: done checking for any_errors_fatal 30564 1726882868.12418: checking for max_fail_percentage 30564 1726882868.12422: done checking for max_fail_percentage 30564 1726882868.12423: checking to see if all hosts have failed and the running result is not ok 30564 1726882868.12424: done checking to see if all hosts have failed 30564 1726882868.12427: getting the remaining hosts for this loop 30564 1726882868.12429: done getting the remaining hosts for this loop 30564 1726882868.12438: getting the next task for host managed_node2 30564 1726882868.12448: done getting next task for host managed_node2 30564 1726882868.12454: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30564 1726882868.12460: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882868.12466: getting variables 30564 1726882868.12469: in VariableManager get_vars() 30564 1726882868.12510: Calling all_inventory to load vars for managed_node2 30564 1726882868.12513: Calling groups_inventory to load vars for managed_node2 30564 1726882868.12517: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882868.12531: Calling all_plugins_play to load vars for managed_node2 30564 1726882868.12538: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882868.12543: Calling groups_plugins_play to load vars for managed_node2 30564 1726882868.18929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882868.20779: done with get_vars() 30564 1726882868.20801: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:41:08 -0400 (0:00:00.116) 0:01:06.790 ****** 30564 1726882868.20892: entering _queue_task() for managed_node2/include_tasks 30564 1726882868.21256: worker is 1 (out of 1 available) 30564 1726882868.21274: exiting _queue_task() for managed_node2/include_tasks 30564 1726882868.21287: done queuing things up, now waiting for results queue to drain 30564 1726882868.21289: waiting for pending results... 
30564 1726882868.21588: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30564 1726882868.21722: in run() - task 0e448fcc-3ce9-4216-acec-0000000015d4 30564 1726882868.21755: variable 'ansible_search_path' from source: unknown 30564 1726882868.21781: variable 'ansible_search_path' from source: unknown 30564 1726882868.21823: calling self._execute() 30564 1726882868.21945: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.21966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.21985: variable 'omit' from source: magic vars 30564 1726882868.22437: variable 'ansible_distribution_major_version' from source: facts 30564 1726882868.22457: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882868.22473: _execute() done 30564 1726882868.22481: dumping result to json 30564 1726882868.22489: done dumping result, returning 30564 1726882868.22508: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4216-acec-0000000015d4] 30564 1726882868.22524: sending task result for task 0e448fcc-3ce9-4216-acec-0000000015d4 30564 1726882868.22673: no more pending results, returning what we have 30564 1726882868.22680: in VariableManager get_vars() 30564 1726882868.22721: Calling all_inventory to load vars for managed_node2 30564 1726882868.22726: Calling groups_inventory to load vars for managed_node2 30564 1726882868.22730: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882868.22745: Calling all_plugins_play to load vars for managed_node2 30564 1726882868.22748: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882868.22752: Calling groups_plugins_play to load vars for managed_node2 30564 1726882868.23784: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000015d4 30564 1726882868.23788: WORKER PROCESS EXITING 30564 
1726882868.24757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882868.26616: done with get_vars() 30564 1726882868.26642: variable 'ansible_search_path' from source: unknown 30564 1726882868.26643: variable 'ansible_search_path' from source: unknown 30564 1726882868.26651: variable 'item' from source: include params 30564 1726882868.26791: variable 'item' from source: include params 30564 1726882868.26833: we have included files to process 30564 1726882868.26834: generating all_blocks data 30564 1726882868.26836: done generating all_blocks data 30564 1726882868.26846: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882868.26850: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882868.26852: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882868.28061: done processing included file 30564 1726882868.28067: iterating over new_blocks loaded from include file 30564 1726882868.28070: in VariableManager get_vars() 30564 1726882868.28088: done with get_vars() 30564 1726882868.28090: filtering new block on tags 30564 1726882868.28190: done filtering new block on tags 30564 1726882868.28193: in VariableManager get_vars() 30564 1726882868.28216: done with get_vars() 30564 1726882868.28218: filtering new block on tags 30564 1726882868.28290: done filtering new block on tags 30564 1726882868.28293: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30564 1726882868.28298: extending task lists for all hosts with included blocks 30564 1726882868.28785: done 
extending task lists 30564 1726882868.28789: done processing included files 30564 1726882868.28790: results queue empty 30564 1726882868.28790: checking for any_errors_fatal 30564 1726882868.28823: done checking for any_errors_fatal 30564 1726882868.28826: checking for max_fail_percentage 30564 1726882868.28827: done checking for max_fail_percentage 30564 1726882868.28828: checking to see if all hosts have failed and the running result is not ok 30564 1726882868.28829: done checking to see if all hosts have failed 30564 1726882868.28830: getting the remaining hosts for this loop 30564 1726882868.28853: done getting the remaining hosts for this loop 30564 1726882868.28858: getting the next task for host managed_node2 30564 1726882868.28862: done getting next task for host managed_node2 30564 1726882868.28869: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882868.28885: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882868.28888: getting variables 30564 1726882868.28889: in VariableManager get_vars() 30564 1726882868.28898: Calling all_inventory to load vars for managed_node2 30564 1726882868.28900: Calling groups_inventory to load vars for managed_node2 30564 1726882868.28902: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882868.28907: Calling all_plugins_play to load vars for managed_node2 30564 1726882868.28909: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882868.28912: Calling groups_plugins_play to load vars for managed_node2 30564 1726882868.31724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882868.35835: done with get_vars() 30564 1726882868.35898: done getting variables 30564 1726882868.36108: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:41:08 -0400 (0:00:00.152) 0:01:06.942 ****** 30564 1726882868.36143: entering _queue_task() for managed_node2/set_fact 30564 1726882868.37496: worker is 1 (out of 1 available) 30564 1726882868.37549: exiting _queue_task() for managed_node2/set_fact 30564 1726882868.37584: done queuing things up, now waiting for results queue to drain 30564 1726882868.37586: waiting for pending results... 
30564 1726882868.37986: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882868.38126: in run() - task 0e448fcc-3ce9-4216-acec-000000001665 30564 1726882868.38152: variable 'ansible_search_path' from source: unknown 30564 1726882868.38159: variable 'ansible_search_path' from source: unknown 30564 1726882868.38203: calling self._execute() 30564 1726882868.38315: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.38329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.38347: variable 'omit' from source: magic vars 30564 1726882868.39582: variable 'ansible_distribution_major_version' from source: facts 30564 1726882868.39619: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882868.39651: variable 'omit' from source: magic vars 30564 1726882868.39759: variable 'omit' from source: magic vars 30564 1726882868.39940: variable 'omit' from source: magic vars 30564 1726882868.40070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882868.40222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882868.40273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882868.40339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.40418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.40508: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882868.40569: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.40602: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882868.40950: Set connection var ansible_timeout to 10 30564 1726882868.40962: Set connection var ansible_pipelining to False 30564 1726882868.40977: Set connection var ansible_shell_type to sh 30564 1726882868.40989: Set connection var ansible_shell_executable to /bin/sh 30564 1726882868.41002: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882868.41018: Set connection var ansible_connection to ssh 30564 1726882868.41172: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.41219: variable 'ansible_connection' from source: unknown 30564 1726882868.41252: variable 'ansible_module_compression' from source: unknown 30564 1726882868.41326: variable 'ansible_shell_type' from source: unknown 30564 1726882868.41376: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.41402: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.41416: variable 'ansible_pipelining' from source: unknown 30564 1726882868.41428: variable 'ansible_timeout' from source: unknown 30564 1726882868.41456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.41917: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882868.42026: variable 'omit' from source: magic vars 30564 1726882868.42218: starting attempt loop 30564 1726882868.42241: running the handler 30564 1726882868.42269: handler run complete 30564 1726882868.42359: attempt loop complete, returning result 30564 1726882868.42440: _execute() done 30564 1726882868.42453: dumping result to json 30564 1726882868.42489: done dumping result, returning 30564 1726882868.42508: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4216-acec-000000001665] 30564 1726882868.42547: sending task result for task 0e448fcc-3ce9-4216-acec-000000001665 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30564 1726882868.42752: no more pending results, returning what we have 30564 1726882868.42759: results queue empty 30564 1726882868.42761: checking for any_errors_fatal 30564 1726882868.42763: done checking for any_errors_fatal 30564 1726882868.42767: checking for max_fail_percentage 30564 1726882868.42774: done checking for max_fail_percentage 30564 1726882868.42775: checking to see if all hosts have failed and the running result is not ok 30564 1726882868.42776: done checking to see if all hosts have failed 30564 1726882868.42777: getting the remaining hosts for this loop 30564 1726882868.42782: done getting the remaining hosts for this loop 30564 1726882868.42786: getting the next task for host managed_node2 30564 1726882868.42795: done getting next task for host managed_node2 30564 1726882868.42797: ^ task is: TASK: Stat profile file 30564 1726882868.42804: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882868.42809: getting variables 30564 1726882868.42810: in VariableManager get_vars() 30564 1726882868.42851: Calling all_inventory to load vars for managed_node2 30564 1726882868.42854: Calling groups_inventory to load vars for managed_node2 30564 1726882868.42861: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882868.42883: Calling all_plugins_play to load vars for managed_node2 30564 1726882868.42891: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882868.42895: Calling groups_plugins_play to load vars for managed_node2 30564 1726882868.43949: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001665 30564 1726882868.43952: WORKER PROCESS EXITING 30564 1726882868.46091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882868.47423: done with get_vars() 30564 1726882868.47439: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:41:08 -0400 (0:00:00.113) 0:01:07.056 ****** 30564 1726882868.47522: entering _queue_task() for managed_node2/stat 30564 1726882868.47758: worker is 1 (out of 1 available) 30564 1726882868.47774: exiting _queue_task() for managed_node2/stat 30564 1726882868.47786: done queuing things up, now waiting for results queue to drain 30564 1726882868.47788: waiting for pending results... 
30564 1726882868.48098: running TaskExecutor() for managed_node2/TASK: Stat profile file 30564 1726882868.48242: in run() - task 0e448fcc-3ce9-4216-acec-000000001666 30564 1726882868.48270: variable 'ansible_search_path' from source: unknown 30564 1726882868.48279: variable 'ansible_search_path' from source: unknown 30564 1726882868.48315: calling self._execute() 30564 1726882868.48428: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.48441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.48460: variable 'omit' from source: magic vars 30564 1726882868.48805: variable 'ansible_distribution_major_version' from source: facts 30564 1726882868.48816: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882868.48821: variable 'omit' from source: magic vars 30564 1726882868.48855: variable 'omit' from source: magic vars 30564 1726882868.48971: variable 'profile' from source: play vars 30564 1726882868.48975: variable 'interface' from source: play vars 30564 1726882868.49134: variable 'interface' from source: play vars 30564 1726882868.49137: variable 'omit' from source: magic vars 30564 1726882868.49140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882868.49143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882868.49148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882868.49150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.49152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.49183: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 
1726882868.49186: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.49189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.49289: Set connection var ansible_timeout to 10 30564 1726882868.49295: Set connection var ansible_pipelining to False 30564 1726882868.49298: Set connection var ansible_shell_type to sh 30564 1726882868.49304: Set connection var ansible_shell_executable to /bin/sh 30564 1726882868.49312: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882868.49315: Set connection var ansible_connection to ssh 30564 1726882868.49339: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.49343: variable 'ansible_connection' from source: unknown 30564 1726882868.49345: variable 'ansible_module_compression' from source: unknown 30564 1726882868.49347: variable 'ansible_shell_type' from source: unknown 30564 1726882868.49350: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.49352: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.49354: variable 'ansible_pipelining' from source: unknown 30564 1726882868.49357: variable 'ansible_timeout' from source: unknown 30564 1726882868.49362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.49563: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882868.49577: variable 'omit' from source: magic vars 30564 1726882868.49583: starting attempt loop 30564 1726882868.49586: running the handler 30564 1726882868.49599: _low_level_execute_command(): starting 30564 1726882868.49608: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882868.50258: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.50262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.50278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882868.50285: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882868.50295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.50328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.50331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882868.50334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.50374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882868.50381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.50400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.50513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.52160: stdout chunk (state=3): >>>/root <<< 30564 1726882868.52274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.52337: stderr chunk (state=3): >>><<< 30564 1726882868.52341: stdout chunk (state=3): >>><<< 30564 
1726882868.52366: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882868.52379: _low_level_execute_command(): starting 30564 1726882868.52382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776 `" && echo ansible-tmp-1726882868.5236468-33466-87835933523776="` echo /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776 `" ) && sleep 0' 30564 1726882868.52914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882868.52925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.52935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.52946: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.52978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882868.52993: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882868.52996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882868.53006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.53016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.53022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.53033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882868.53036: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882868.53043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.53095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882868.53107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.53122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.53229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.55097: stdout chunk (state=3): >>>ansible-tmp-1726882868.5236468-33466-87835933523776=/root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776 <<< 30564 1726882868.55210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.55253: stderr 
chunk (state=3): >>><<< 30564 1726882868.55258: stdout chunk (state=3): >>><<< 30564 1726882868.55273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882868.5236468-33466-87835933523776=/root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882868.55309: variable 'ansible_module_compression' from source: unknown 30564 1726882868.55351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882868.55379: variable 'ansible_facts' from source: unknown 30564 1726882868.55438: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776/AnsiballZ_stat.py 30564 1726882868.55541: Sending initial data 30564 1726882868.55544: Sent initial data (152 bytes) 30564 1726882868.56182: stderr 
chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.56185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.56215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.56219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.56222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.56273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.56278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.56392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.58100: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882868.58192: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882868.58292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp0ddpsepc /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776/AnsiballZ_stat.py <<< 30564 1726882868.58385: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882868.59408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.59494: stderr chunk (state=3): >>><<< 30564 1726882868.59498: stdout chunk (state=3): >>><<< 30564 1726882868.59512: done transferring module to remote 30564 1726882868.59520: _low_level_execute_command(): starting 30564 1726882868.59523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776/ /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776/AnsiballZ_stat.py && sleep 0' 30564 1726882868.59924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.59930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.59980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.59983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.59988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882868.59990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.60036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.60043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.60156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.61880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.61918: stderr chunk (state=3): >>><<< 30564 1726882868.61922: stdout chunk (state=3): >>><<< 30564 1726882868.61936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882868.61941: _low_level_execute_command(): starting 30564 1726882868.61944: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776/AnsiballZ_stat.py && sleep 0' 30564 1726882868.62338: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.62350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.62371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.62385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.62436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.62442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.62568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 30564 1726882868.75709: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882868.76731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.76747: stderr chunk (state=3): >>>Shared connection to 10.31.11.158 closed. <<< 30564 1726882868.76824: stderr chunk (state=3): >>><<< 30564 1726882868.76834: stdout chunk (state=3): >>><<< 30564 1726882868.76869: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882868.76980: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882868.76984: _low_level_execute_command(): starting 30564 1726882868.76986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882868.5236468-33466-87835933523776/ > /dev/null 2>&1 && sleep 0' 30564 1726882868.77558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882868.77576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.77591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.77609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.77658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882868.77676: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882868.77691: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.77708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882868.77720: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882868.77738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882868.77752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.77779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.77797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.77809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882868.77820: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882868.77834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.77917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882868.77940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.77956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.78090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.79910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.79989: stderr chunk (state=3): >>><<< 30564 1726882868.80000: stdout chunk (state=3): >>><<< 30564 1726882868.80175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882868.80179: handler run complete 30564 1726882868.80181: attempt loop complete, returning result 30564 1726882868.80183: _execute() done 30564 1726882868.80185: dumping result to json 30564 1726882868.80187: done dumping result, returning 30564 1726882868.80189: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4216-acec-000000001666] 30564 1726882868.80191: sending task result for task 0e448fcc-3ce9-4216-acec-000000001666 30564 1726882868.80267: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001666 30564 1726882868.80271: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882868.80523: no more pending results, returning what we have 30564 1726882868.80526: results queue empty 30564 1726882868.80527: checking for any_errors_fatal 30564 1726882868.80533: done checking for any_errors_fatal 30564 1726882868.80534: checking for max_fail_percentage 30564 1726882868.80536: done checking for max_fail_percentage 30564 1726882868.80537: checking to see if all hosts have failed and the 
running result is not ok 30564 1726882868.80537: done checking to see if all hosts have failed 30564 1726882868.80538: getting the remaining hosts for this loop 30564 1726882868.80540: done getting the remaining hosts for this loop 30564 1726882868.80543: getting the next task for host managed_node2 30564 1726882868.80551: done getting next task for host managed_node2 30564 1726882868.80553: ^ task is: TASK: Set NM profile exist flag based on the profile files 30564 1726882868.80559: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882868.80563: getting variables 30564 1726882868.80569: in VariableManager get_vars() 30564 1726882868.80602: Calling all_inventory to load vars for managed_node2 30564 1726882868.80605: Calling groups_inventory to load vars for managed_node2 30564 1726882868.80608: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882868.80619: Calling all_plugins_play to load vars for managed_node2 30564 1726882868.80622: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882868.80625: Calling groups_plugins_play to load vars for managed_node2 30564 1726882868.82421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882868.84304: done with get_vars() 30564 1726882868.84326: done getting variables 30564 1726882868.84397: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:41:08 -0400 (0:00:00.369) 0:01:07.425 ****** 30564 1726882868.84428: entering _queue_task() for managed_node2/set_fact 30564 1726882868.84746: worker is 1 (out of 1 available) 30564 1726882868.84757: exiting _queue_task() for managed_node2/set_fact 30564 1726882868.84774: done queuing things up, now waiting for results queue to drain 30564 1726882868.84775: waiting for pending results... 
30564 1726882868.85086: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30564 1726882868.85230: in run() - task 0e448fcc-3ce9-4216-acec-000000001667 30564 1726882868.85257: variable 'ansible_search_path' from source: unknown 30564 1726882868.85267: variable 'ansible_search_path' from source: unknown 30564 1726882868.85306: calling self._execute() 30564 1726882868.85421: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.85438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.85465: variable 'omit' from source: magic vars 30564 1726882868.85865: variable 'ansible_distribution_major_version' from source: facts 30564 1726882868.85895: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882868.86039: variable 'profile_stat' from source: set_fact 30564 1726882868.86053: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882868.86061: when evaluation is False, skipping this task 30564 1726882868.86072: _execute() done 30564 1726882868.86080: dumping result to json 30564 1726882868.86092: done dumping result, returning 30564 1726882868.86112: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4216-acec-000000001667] 30564 1726882868.86123: sending task result for task 0e448fcc-3ce9-4216-acec-000000001667 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882868.86271: no more pending results, returning what we have 30564 1726882868.86275: results queue empty 30564 1726882868.86276: checking for any_errors_fatal 30564 1726882868.86290: done checking for any_errors_fatal 30564 1726882868.86290: checking for max_fail_percentage 30564 1726882868.86293: done checking for max_fail_percentage 30564 1726882868.86294: checking to see if all 
hosts have failed and the running result is not ok 30564 1726882868.86294: done checking to see if all hosts have failed 30564 1726882868.86295: getting the remaining hosts for this loop 30564 1726882868.86297: done getting the remaining hosts for this loop 30564 1726882868.86301: getting the next task for host managed_node2 30564 1726882868.86310: done getting next task for host managed_node2 30564 1726882868.86312: ^ task is: TASK: Get NM profile info 30564 1726882868.86319: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882868.86324: getting variables 30564 1726882868.86326: in VariableManager get_vars() 30564 1726882868.86371: Calling all_inventory to load vars for managed_node2 30564 1726882868.86374: Calling groups_inventory to load vars for managed_node2 30564 1726882868.86378: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882868.86392: Calling all_plugins_play to load vars for managed_node2 30564 1726882868.86396: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882868.86399: Calling groups_plugins_play to load vars for managed_node2 30564 1726882868.87432: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001667 30564 1726882868.87436: WORKER PROCESS EXITING 30564 1726882868.88209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882868.90001: done with get_vars() 30564 1726882868.90021: done getting variables 30564 1726882868.90086: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:41:08 -0400 (0:00:00.056) 0:01:07.482 ****** 30564 1726882868.90115: entering _queue_task() for managed_node2/shell 30564 1726882868.90390: worker is 1 (out of 1 available) 30564 1726882868.90407: exiting _queue_task() for managed_node2/shell 30564 1726882868.90420: done queuing things up, now waiting for results queue to drain 30564 1726882868.90421: waiting for pending results... 
30564 1726882868.90718: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30564 1726882868.90859: in run() - task 0e448fcc-3ce9-4216-acec-000000001668 30564 1726882868.90885: variable 'ansible_search_path' from source: unknown 30564 1726882868.90894: variable 'ansible_search_path' from source: unknown 30564 1726882868.90932: calling self._execute() 30564 1726882868.91030: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.91034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.91043: variable 'omit' from source: magic vars 30564 1726882868.91341: variable 'ansible_distribution_major_version' from source: facts 30564 1726882868.91352: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882868.91358: variable 'omit' from source: magic vars 30564 1726882868.91398: variable 'omit' from source: magic vars 30564 1726882868.91462: variable 'profile' from source: play vars 30564 1726882868.91470: variable 'interface' from source: play vars 30564 1726882868.91516: variable 'interface' from source: play vars 30564 1726882868.91531: variable 'omit' from source: magic vars 30564 1726882868.91565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882868.91591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882868.91610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882868.91621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.91630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882868.91653: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 
1726882868.91657: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.91659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.91729: Set connection var ansible_timeout to 10 30564 1726882868.91733: Set connection var ansible_pipelining to False 30564 1726882868.91735: Set connection var ansible_shell_type to sh 30564 1726882868.91740: Set connection var ansible_shell_executable to /bin/sh 30564 1726882868.91747: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882868.91749: Set connection var ansible_connection to ssh 30564 1726882868.91770: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.91773: variable 'ansible_connection' from source: unknown 30564 1726882868.91775: variable 'ansible_module_compression' from source: unknown 30564 1726882868.91777: variable 'ansible_shell_type' from source: unknown 30564 1726882868.91780: variable 'ansible_shell_executable' from source: unknown 30564 1726882868.91782: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882868.91784: variable 'ansible_pipelining' from source: unknown 30564 1726882868.91786: variable 'ansible_timeout' from source: unknown 30564 1726882868.91790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882868.91887: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882868.91896: variable 'omit' from source: magic vars 30564 1726882868.91901: starting attempt loop 30564 1726882868.91903: running the handler 30564 1726882868.91913: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882868.91929: _low_level_execute_command(): starting 30564 1726882868.91937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882868.92430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.92441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.92471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.92486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.92536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882868.92549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.92659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.94343: stdout chunk (state=3): >>>/root <<< 30564 1726882868.94451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.94528: 
stderr chunk (state=3): >>><<< 30564 1726882868.94546: stdout chunk (state=3): >>><<< 30564 1726882868.94666: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882868.94671: _low_level_execute_command(): starting 30564 1726882868.94674: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643 `" && echo ansible-tmp-1726882868.9458091-33483-278858045777643="` echo /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643 `" ) && sleep 0' 30564 1726882868.95203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882868.95214: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.95255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.95259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.95262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.95311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.95315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.95427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882868.97310: stdout chunk (state=3): >>>ansible-tmp-1726882868.9458091-33483-278858045777643=/root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643 <<< 30564 1726882868.97473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882868.97479: stderr chunk (state=3): >>><<< 30564 1726882868.97481: stdout chunk (state=3): >>><<< 30564 1726882868.97500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882868.9458091-33483-278858045777643=/root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882868.97530: variable 'ansible_module_compression' from source: unknown 30564 1726882868.97592: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882868.97623: variable 'ansible_facts' from source: unknown 30564 1726882868.97707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643/AnsiballZ_command.py 30564 1726882868.97826: Sending initial data 30564 1726882868.97829: Sent initial data (156 bytes) 30564 1726882868.98456: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.98465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882868.98513: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.98516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882868.98518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882868.98567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882868.98570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882868.98689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882869.00416: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882869.00505: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882869.00603: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-30564uwjv555r/tmpi8wrfs7p /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643/AnsiballZ_command.py <<< 30564 1726882869.00696: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882869.01878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882869.02075: stderr chunk (state=3): >>><<< 30564 1726882869.02079: stdout chunk (state=3): >>><<< 30564 1726882869.02081: done transferring module to remote 30564 1726882869.02084: _low_level_execute_command(): starting 30564 1726882869.02086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643/ /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643/AnsiballZ_command.py && sleep 0' 30564 1726882869.02696: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882869.02711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.02734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.02753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.02798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882869.02811: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882869.02826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.02854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882869.02869: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882869.02882: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882869.02894: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.02913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.02929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.02941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882869.02955: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882869.02976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.03053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882869.03078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882869.03094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882869.03222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882869.04951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882869.04992: stderr chunk (state=3): >>><<< 30564 1726882869.04995: stdout chunk (state=3): >>><<< 30564 1726882869.05010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882869.05014: _low_level_execute_command(): starting 30564 1726882869.05017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643/AnsiballZ_command.py && sleep 0' 30564 1726882869.05420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.05425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.05482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882869.05485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882869.05487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.05489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.05534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882869.05538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882869.05646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882869.20705: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:41:09.185273", "end": "2024-09-20 21:41:09.204964", "delta": "0:00:00.019691", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882869.21917: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882869.21921: stderr chunk (state=3): >>><<< 30564 1726882869.21928: stdout chunk (state=3): >>><<< 30564 1726882869.21948: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:41:09.185273", "end": "2024-09-20 21:41:09.204964", "delta": "0:00:00.019691", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
30564 1726882869.21989: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30564 1726882869.21996: _low_level_execute_command(): starting
30564 1726882869.22002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882868.9458091-33483-278858045777643/ > /dev/null 2>&1 && sleep 0'
30564 1726882869.22625: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882869.22629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882869.22672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882869.22681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<<
30564 1726882869.22685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882869.22704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882869.22709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882869.22782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882869.22786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882869.22795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882869.22920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882869.24751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882869.24843: stderr chunk (state=3): >>><<<
30564 1726882869.24854: stdout chunk (state=3): >>><<<
30564 1726882869.25058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30564 1726882869.25061: handler run complete
30564 1726882869.25065: Evaluated conditional (False): False
30564 1726882869.25067: attempt loop complete, returning result
30564 1726882869.25068: _execute() done
30564 1726882869.25070: dumping result to json
30564 1726882869.25072: done dumping result, returning
30564 1726882869.25073: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4216-acec-000000001668]
30564 1726882869.25075: sending task result for task 0e448fcc-3ce9-4216-acec-000000001668
30564 1726882869.25141: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001668
30564 1726882869.25144: WORKER PROCESS EXITING
fatal: [managed_node2]: FAILED! => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc",
    "delta": "0:00:00.019691",
    "end": "2024-09-20 21:41:09.204964",
    "rc": 1,
    "start": "2024-09-20 21:41:09.185273"
}

MSG:

non-zero return code
...ignoring
30564 1726882869.25229: no more pending results, returning what we have
30564 1726882869.25232: results queue empty
30564 1726882869.25233: checking for any_errors_fatal
30564 1726882869.25239: done checking for any_errors_fatal
30564 1726882869.25239: checking for max_fail_percentage
30564 1726882869.25241: done checking for max_fail_percentage
30564 1726882869.25242: checking to see if all hosts have failed and the running result is not ok
30564 1726882869.25243: done checking to see if all hosts have failed
30564 1726882869.25243: getting the remaining hosts for this loop
30564 1726882869.25245: done getting the remaining hosts for this loop
30564 1726882869.25248: getting the next task for host managed_node2
30564 1726882869.25256: done getting next task for host managed_node2
30564 1726882869.25258: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30564 1726882869.25270: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882869.25275: getting variables
30564 1726882869.25276: in VariableManager get_vars()
30564 1726882869.25306: Calling all_inventory to load vars for managed_node2
30564 1726882869.25309: Calling groups_inventory to load vars for managed_node2
30564 1726882869.25312: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882869.25321: Calling all_plugins_play to load vars for managed_node2
30564 1726882869.25324: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882869.25326: Calling groups_plugins_play to load vars for managed_node2
30564 1726882869.26336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882869.27462: done with get_vars()
30564 1726882869.27491: done getting variables
30564 1726882869.27553: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 21:41:09 -0400 (0:00:00.374) 0:01:07.857 ******
30564 1726882869.27590: entering _queue_task() for managed_node2/set_fact
30564 1726882869.27909: worker is 1 (out of 1 available)
30564 1726882869.27921: exiting _queue_task() for managed_node2/set_fact
30564 1726882869.27935: done queuing things up, now waiting for results queue to drain
30564 1726882869.27936: waiting for pending results...
30564 1726882869.28253: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
30564 1726882869.28355: in run() - task 0e448fcc-3ce9-4216-acec-000000001669
30564 1726882869.28368: variable 'ansible_search_path' from source: unknown
30564 1726882869.28375: variable 'ansible_search_path' from source: unknown
30564 1726882869.28418: calling self._execute()
30564 1726882869.28510: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882869.28515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882869.28525: variable 'omit' from source: magic vars
30564 1726882869.28818: variable 'ansible_distribution_major_version' from source: facts
30564 1726882869.28829: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882869.28927: variable 'nm_profile_exists' from source: set_fact
30564 1726882869.28936: Evaluated conditional (nm_profile_exists.rc == 0): False
30564 1726882869.28939: when evaluation is False, skipping this task
30564 1726882869.28942: _execute() done
30564 1726882869.28945: dumping result to json
30564 1726882869.28947: done dumping result, returning
30564 1726882869.28953: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4216-acec-000000001669]
30564 1726882869.28959: sending task result for task 0e448fcc-3ce9-4216-acec-000000001669
30564 1726882869.29050: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001669
30564 1726882869.29053: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "nm_profile_exists.rc == 0",
    "skip_reason": "Conditional result was False"
}
30564 1726882869.29105: no more pending results, returning what we have
30564 1726882869.29109: results queue empty
30564 1726882869.29110: checking for any_errors_fatal
30564 1726882869.29122: done checking for any_errors_fatal
30564 1726882869.29122: checking for max_fail_percentage
30564 1726882869.29124: done checking for max_fail_percentage
30564 1726882869.29125: checking to see if all hosts have failed and the running result is not ok
30564 1726882869.29126: done checking to see if all hosts have failed
30564 1726882869.29126: getting the remaining hosts for this loop
30564 1726882869.29128: done getting the remaining hosts for this loop
30564 1726882869.29131: getting the next task for host managed_node2
30564 1726882869.29140: done getting next task for host managed_node2
30564 1726882869.29143: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
30564 1726882869.29148: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882869.29151: getting variables
30564 1726882869.29152: in VariableManager get_vars()
30564 1726882869.29193: Calling all_inventory to load vars for managed_node2
30564 1726882869.29195: Calling groups_inventory to load vars for managed_node2
30564 1726882869.29198: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882869.29208: Calling all_plugins_play to load vars for managed_node2
30564 1726882869.29211: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882869.29213: Calling groups_plugins_play to load vars for managed_node2
30564 1726882869.30011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882869.31812: done with get_vars()
30564 1726882869.31833: done getting variables
30564 1726882869.31909: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882869.32040: variable 'profile' from source: play vars
30564 1726882869.32045: variable 'interface' from source: play vars
30564 1726882869.32116: variable 'interface' from source: play vars
TASK [Get the ansible_managed comment in ifcfg-statebr] ************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 21:41:09 -0400 (0:00:00.045) 0:01:07.902 ******
30564 1726882869.32148: entering _queue_task() for managed_node2/command
30564 1726882869.32398: worker is 1 (out of 1 available)
30564 1726882869.32412: exiting _queue_task() for managed_node2/command
30564 1726882869.32424: done queuing things up, now waiting for results queue to drain
30564 1726882869.32426: waiting for pending results...
30564 1726882869.32603: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr
30564 1726882869.32689: in run() - task 0e448fcc-3ce9-4216-acec-00000000166b
30564 1726882869.32701: variable 'ansible_search_path' from source: unknown
30564 1726882869.32705: variable 'ansible_search_path' from source: unknown
30564 1726882869.32732: calling self._execute()
30564 1726882869.32814: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882869.32818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882869.32827: variable 'omit' from source: magic vars
30564 1726882869.33089: variable 'ansible_distribution_major_version' from source: facts
30564 1726882869.33102: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882869.33185: variable 'profile_stat' from source: set_fact
30564 1726882869.33192: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882869.33195: when evaluation is False, skipping this task
30564 1726882869.33199: _execute() done
30564 1726882869.33202: dumping result to json
30564 1726882869.33205: done dumping result, returning
30564 1726882869.33212: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000166b]
30564 1726882869.33214: sending task result for task 0e448fcc-3ce9-4216-acec-00000000166b
30564 1726882869.33302: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000166b
30564 1726882869.33305: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882869.33379: no more pending results, returning what we have
30564 1726882869.33382: results queue empty
30564 1726882869.33383: checking for any_errors_fatal
30564 1726882869.33389: done checking for any_errors_fatal
30564 1726882869.33389: checking for max_fail_percentage
30564 1726882869.33391: done checking for max_fail_percentage
30564 1726882869.33392: checking to see if all hosts have failed and the running result is not ok
30564 1726882869.33392: done checking to see if all hosts have failed
30564 1726882869.33393: getting the remaining hosts for this loop
30564 1726882869.33394: done getting the remaining hosts for this loop
30564 1726882869.33397: getting the next task for host managed_node2
30564 1726882869.33404: done getting next task for host managed_node2
30564 1726882869.33406: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
30564 1726882869.33410: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882869.33414: getting variables
30564 1726882869.33415: in VariableManager get_vars()
30564 1726882869.33450: Calling all_inventory to load vars for managed_node2
30564 1726882869.33453: Calling groups_inventory to load vars for managed_node2
30564 1726882869.33455: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882869.33466: Calling all_plugins_play to load vars for managed_node2
30564 1726882869.33470: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882869.33472: Calling groups_plugins_play to load vars for managed_node2
30564 1726882869.34372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882869.35960: done with get_vars()
30564 1726882869.35985: done getting variables
30564 1726882869.36040: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882869.36146: variable 'profile' from source: play vars
30564 1726882869.36150: variable 'interface' from source: play vars
30564 1726882869.36210: variable 'interface' from source: play vars
TASK [Verify the ansible_managed comment in ifcfg-statebr] *********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 21:41:09 -0400 (0:00:00.040) 0:01:07.943 ******
30564 1726882869.36242: entering _queue_task() for managed_node2/set_fact
30564 1726882869.36519: worker is 1 (out of 1 available)
30564 1726882869.36533: exiting _queue_task() for managed_node2/set_fact
30564 1726882869.36546: done queuing things up, now waiting for results queue to drain
30564 1726882869.36547: waiting for pending results...
30564 1726882869.36843: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr
30564 1726882869.36980: in run() - task 0e448fcc-3ce9-4216-acec-00000000166c
30564 1726882869.37004: variable 'ansible_search_path' from source: unknown
30564 1726882869.37016: variable 'ansible_search_path' from source: unknown
30564 1726882869.37056: calling self._execute()
30564 1726882869.37159: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882869.37162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882869.37196: variable 'omit' from source: magic vars
30564 1726882869.37478: variable 'ansible_distribution_major_version' from source: facts
30564 1726882869.37489: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882869.37578: variable 'profile_stat' from source: set_fact
30564 1726882869.37586: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882869.37589: when evaluation is False, skipping this task
30564 1726882869.37591: _execute() done
30564 1726882869.37594: dumping result to json
30564 1726882869.37596: done dumping result, returning
30564 1726882869.37602: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000166c]
30564 1726882869.37609: sending task result for task 0e448fcc-3ce9-4216-acec-00000000166c
30564 1726882869.37699: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000166c
30564 1726882869.37703: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882869.37750: no more pending results, returning what we have
30564 1726882869.37753: results queue empty
30564 1726882869.37754: checking for any_errors_fatal
30564 1726882869.37761: done checking for any_errors_fatal
30564 1726882869.37762: checking for max_fail_percentage
30564 1726882869.37765: done checking for max_fail_percentage
30564 1726882869.37766: checking to see if all hosts have failed and the running result is not ok
30564 1726882869.37767: done checking to see if all hosts have failed
30564 1726882869.37768: getting the remaining hosts for this loop
30564 1726882869.37769: done getting the remaining hosts for this loop
30564 1726882869.37773: getting the next task for host managed_node2
30564 1726882869.37780: done getting next task for host managed_node2
30564 1726882869.37782: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
30564 1726882869.37786: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882869.37790: getting variables
30564 1726882869.37791: in VariableManager get_vars()
30564 1726882869.37819: Calling all_inventory to load vars for managed_node2
30564 1726882869.37821: Calling groups_inventory to load vars for managed_node2
30564 1726882869.37824: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882869.37834: Calling all_plugins_play to load vars for managed_node2
30564 1726882869.37836: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882869.37838: Calling groups_plugins_play to load vars for managed_node2
30564 1726882869.38752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882869.40286: done with get_vars()
30564 1726882869.40307: done getting variables
30564 1726882869.40348: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882869.40426: variable 'profile' from source: play vars
30564 1726882869.40429: variable 'interface' from source: play vars
30564 1726882869.40468: variable 'interface' from source: play vars
TASK [Get the fingerprint comment in ifcfg-statebr] ****************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 21:41:09 -0400 (0:00:00.042) 0:01:07.986 ******
30564 1726882869.40493: entering _queue_task() for managed_node2/command
30564 1726882869.40702: worker is 1 (out of 1 available)
30564 1726882869.40718: exiting _queue_task() for managed_node2/command
30564 1726882869.40732: done queuing things up, now waiting for results queue to drain
30564 1726882869.40733: waiting for pending results...
30564 1726882869.40919: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr
30564 1726882869.41000: in run() - task 0e448fcc-3ce9-4216-acec-00000000166d
30564 1726882869.41014: variable 'ansible_search_path' from source: unknown
30564 1726882869.41018: variable 'ansible_search_path' from source: unknown
30564 1726882869.41047: calling self._execute()
30564 1726882869.41125: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882869.41130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882869.41141: variable 'omit' from source: magic vars
30564 1726882869.41408: variable 'ansible_distribution_major_version' from source: facts
30564 1726882869.41418: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882869.41508: variable 'profile_stat' from source: set_fact
30564 1726882869.41516: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882869.41519: when evaluation is False, skipping this task
30564 1726882869.41521: _execute() done
30564 1726882869.41524: dumping result to json
30564 1726882869.41527: done dumping result, returning
30564 1726882869.41532: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000166d]
30564 1726882869.41538: sending task result for task 0e448fcc-3ce9-4216-acec-00000000166d
30564 1726882869.41625: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000166d
30564 1726882869.41628: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882869.41682: no more pending results, returning what we have
30564 1726882869.41686: results queue empty
30564 1726882869.41687: checking for any_errors_fatal
30564 1726882869.41693: done checking for any_errors_fatal
30564 1726882869.41694: checking for max_fail_percentage
30564 1726882869.41695: done checking for max_fail_percentage
30564 1726882869.41696: checking to see if all hosts have failed and the running result is not ok
30564 1726882869.41697: done checking to see if all hosts have failed
30564 1726882869.41698: getting the remaining hosts for this loop
30564 1726882869.41699: done getting the remaining hosts for this loop
30564 1726882869.41702: getting the next task for host managed_node2
30564 1726882869.41709: done getting next task for host managed_node2
30564 1726882869.41711: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
30564 1726882869.41715: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882869.41718: getting variables
30564 1726882869.41719: in VariableManager get_vars()
30564 1726882869.41754: Calling all_inventory to load vars for managed_node2
30564 1726882869.41756: Calling groups_inventory to load vars for managed_node2
30564 1726882869.41759: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882869.41770: Calling all_plugins_play to load vars for managed_node2
30564 1726882869.41772: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882869.41775: Calling groups_plugins_play to load vars for managed_node2
30564 1726882869.43097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882869.44831: done with get_vars()
30564 1726882869.44857: done getting variables
30564 1726882869.44938: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882869.45050: variable 'profile' from source: play vars
30564 1726882869.45054: variable 'interface' from source: play vars
30564 1726882869.45140: variable 'interface' from source: play vars
TASK [Verify the fingerprint comment in ifcfg-statebr] *************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:41:09 -0400 (0:00:00.046) 0:01:08.033 ******
30564 1726882869.45179: entering _queue_task() for managed_node2/set_fact
30564 1726882869.45502: worker is 1 (out of 1 available)
30564 1726882869.45518: exiting _queue_task() for managed_node2/set_fact
30564 1726882869.45536: done queuing things up, now waiting for results queue to drain
30564 1726882869.45538: waiting for pending results...
30564 1726882869.45836: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr
30564 1726882869.45933: in run() - task 0e448fcc-3ce9-4216-acec-00000000166e
30564 1726882869.45949: variable 'ansible_search_path' from source: unknown
30564 1726882869.45972: variable 'ansible_search_path' from source: unknown
30564 1726882869.46002: calling self._execute()
30564 1726882869.46096: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882869.46100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882869.46116: variable 'omit' from source: magic vars
30564 1726882869.46407: variable 'ansible_distribution_major_version' from source: facts
30564 1726882869.46431: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882869.46528: variable 'profile_stat' from source: set_fact
30564 1726882869.46548: Evaluated conditional (profile_stat.stat.exists): False
30564 1726882869.46551: when evaluation is False, skipping this task
30564 1726882869.46554: _execute() done
30564 1726882869.46566: dumping result to json
30564 1726882869.46575: done dumping result, returning
30564 1726882869.46578: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000166e]
30564 1726882869.46580: sending task result for task 0e448fcc-3ce9-4216-acec-00000000166e
30564 1726882869.46680: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000166e
30564 1726882869.46684: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
30564 1726882869.46731: no more pending results, returning what we have
30564 1726882869.46735: results queue empty
30564 1726882869.46736: checking for any_errors_fatal
30564 1726882869.46741: done checking for any_errors_fatal
30564 1726882869.46741: checking for max_fail_percentage
30564 1726882869.46743: done checking for max_fail_percentage
30564 1726882869.46743: checking to see if all hosts have failed and the running result is not ok
30564 1726882869.46744: done checking to see if all hosts have failed
30564 1726882869.46745: getting the remaining hosts for this loop
30564 1726882869.46746: done getting the remaining hosts for this loop
30564 1726882869.46749: getting the next task for host managed_node2
30564 1726882869.46757: done getting next task for host managed_node2
30564 1726882869.46759: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}'
30564 1726882869.46763: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882869.46769: getting variables
30564 1726882869.46771: in VariableManager get_vars()
30564 1726882869.46805: Calling all_inventory to load vars for managed_node2
30564 1726882869.46808: Calling groups_inventory to load vars for managed_node2
30564 1726882869.46811: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882869.46820: Calling all_plugins_play to load vars for managed_node2
30564 1726882869.46822: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882869.46824: Calling groups_plugins_play to load vars for managed_node2
30564 1726882869.47607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882869.48577: done with get_vars()
30564 1726882869.48592: done getting variables
30564 1726882869.48631: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882869.48709: variable 'profile' from source: play vars
30564 1726882869.48712: variable 'interface' from source: play vars
30564 1726882869.48796: variable 'interface' from source: play vars
TASK [Assert that the profile is absent - 'statebr'] ***************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Friday 20 September 2024 21:41:09 -0400 (0:00:00.036) 0:01:08.069 ******
30564 1726882869.48837: entering _queue_task() for managed_node2/assert
30564 1726882869.49108: worker is 1 (out of 1 available)
30564 1726882869.49122: exiting _queue_task() for managed_node2/assert
30564 1726882869.49136: done queuing things up, now waiting for results queue to drain
30564 1726882869.49138: waiting for pending results...
30564 1726882869.49432: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 30564 1726882869.49510: in run() - task 0e448fcc-3ce9-4216-acec-0000000015d5 30564 1726882869.49521: variable 'ansible_search_path' from source: unknown 30564 1726882869.49525: variable 'ansible_search_path' from source: unknown 30564 1726882869.49552: calling self._execute() 30564 1726882869.49686: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.49690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.49700: variable 'omit' from source: magic vars 30564 1726882869.50086: variable 'ansible_distribution_major_version' from source: facts 30564 1726882869.50099: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882869.50108: variable 'omit' from source: magic vars 30564 1726882869.50171: variable 'omit' from source: magic vars 30564 1726882869.50271: variable 'profile' from source: play vars 30564 1726882869.50279: variable 'interface' from source: play vars 30564 1726882869.50334: variable 'interface' from source: play vars 30564 1726882869.50352: variable 'omit' from source: magic vars 30564 1726882869.50387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882869.50414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882869.50440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882869.50451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882869.50463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882869.50490: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30564 1726882869.50493: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.50496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.50623: Set connection var ansible_timeout to 10 30564 1726882869.50635: Set connection var ansible_pipelining to False 30564 1726882869.50638: Set connection var ansible_shell_type to sh 30564 1726882869.50646: Set connection var ansible_shell_executable to /bin/sh 30564 1726882869.50654: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882869.50657: Set connection var ansible_connection to ssh 30564 1726882869.50685: variable 'ansible_shell_executable' from source: unknown 30564 1726882869.50699: variable 'ansible_connection' from source: unknown 30564 1726882869.50702: variable 'ansible_module_compression' from source: unknown 30564 1726882869.50704: variable 'ansible_shell_type' from source: unknown 30564 1726882869.50707: variable 'ansible_shell_executable' from source: unknown 30564 1726882869.50709: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.50711: variable 'ansible_pipelining' from source: unknown 30564 1726882869.50714: variable 'ansible_timeout' from source: unknown 30564 1726882869.50716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.50822: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882869.50831: variable 'omit' from source: magic vars 30564 1726882869.50836: starting attempt loop 30564 1726882869.50838: running the handler 30564 1726882869.50986: variable 'lsr_net_profile_exists' from source: set_fact 30564 1726882869.50989: Evaluated conditional (not 
lsr_net_profile_exists): True 30564 1726882869.50998: handler run complete 30564 1726882869.51012: attempt loop complete, returning result 30564 1726882869.51015: _execute() done 30564 1726882869.51017: dumping result to json 30564 1726882869.51020: done dumping result, returning 30564 1726882869.51026: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [0e448fcc-3ce9-4216-acec-0000000015d5] 30564 1726882869.51034: sending task result for task 0e448fcc-3ce9-4216-acec-0000000015d5 30564 1726882869.51151: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000015d5 30564 1726882869.51160: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882869.51248: no more pending results, returning what we have 30564 1726882869.51251: results queue empty 30564 1726882869.51252: checking for any_errors_fatal 30564 1726882869.51258: done checking for any_errors_fatal 30564 1726882869.51259: checking for max_fail_percentage 30564 1726882869.51260: done checking for max_fail_percentage 30564 1726882869.51261: checking to see if all hosts have failed and the running result is not ok 30564 1726882869.51262: done checking to see if all hosts have failed 30564 1726882869.51264: getting the remaining hosts for this loop 30564 1726882869.51266: done getting the remaining hosts for this loop 30564 1726882869.51269: getting the next task for host managed_node2 30564 1726882869.51276: done getting next task for host managed_node2 30564 1726882869.51278: ^ task is: TASK: Conditional asserts 30564 1726882869.51302: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882869.51306: getting variables 30564 1726882869.51307: in VariableManager get_vars() 30564 1726882869.51329: Calling all_inventory to load vars for managed_node2 30564 1726882869.51330: Calling groups_inventory to load vars for managed_node2 30564 1726882869.51332: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882869.51339: Calling all_plugins_play to load vars for managed_node2 30564 1726882869.51341: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882869.51344: Calling groups_plugins_play to load vars for managed_node2 30564 1726882869.52299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882869.53408: done with get_vars() 30564 1726882869.53425: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:41:09 -0400 (0:00:00.046) 0:01:08.116 ****** 30564 1726882869.53494: entering _queue_task() for managed_node2/include_tasks 30564 1726882869.53692: worker is 1 (out of 1 available) 30564 1726882869.53705: exiting _queue_task() for managed_node2/include_tasks 30564 1726882869.53723: done queuing things up, now waiting for results queue to drain 30564 1726882869.53725: waiting for pending results... 
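
The 'Conditional asserts' task queued here (at `run_test.yml:42`) is executed by `_queue_task() for managed_node2/include_tasks`, and its result below is a skip with `"skipped_reason": "No items in the list"` — i.e. an include driven by a loop whose list resolved empty. A hedged sketch of such a task; the variable and attribute names are assumptions, not taken from the log:

```yaml
# Hedged sketch of the 'Conditional asserts' task at run_test.yml:42.
# The log confirms only an include_tasks-style task skipped because its
# loop list was empty; lsr_assert_when and item.what are hypothetical.
- name: Conditional asserts
  include_tasks: "{{ item.what }}"       # item.what is hypothetical
  loop: "{{ lsr_assert_when | d([]) }}"  # empty list => task is skipped
```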
30564 1726882869.53983: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30564 1726882869.54089: in run() - task 0e448fcc-3ce9-4216-acec-00000000100b 30564 1726882869.54100: variable 'ansible_search_path' from source: unknown 30564 1726882869.54103: variable 'ansible_search_path' from source: unknown 30564 1726882869.54315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882869.56276: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882869.56321: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882869.56356: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882869.56388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882869.56407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882869.56479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882869.56501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882869.56518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882869.56545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30564 1726882869.56556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882869.56672: dumping result to json 30564 1726882869.56676: done dumping result, returning 30564 1726882869.56683: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0e448fcc-3ce9-4216-acec-00000000100b] 30564 1726882869.56689: sending task result for task 0e448fcc-3ce9-4216-acec-00000000100b 30564 1726882869.56791: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000100b 30564 1726882869.56793: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "skipped_reason": "No items in the list" } 30564 1726882869.56858: no more pending results, returning what we have 30564 1726882869.56861: results queue empty 30564 1726882869.56862: checking for any_errors_fatal 30564 1726882869.56869: done checking for any_errors_fatal 30564 1726882869.56870: checking for max_fail_percentage 30564 1726882869.56871: done checking for max_fail_percentage 30564 1726882869.56872: checking to see if all hosts have failed and the running result is not ok 30564 1726882869.56873: done checking to see if all hosts have failed 30564 1726882869.56874: getting the remaining hosts for this loop 30564 1726882869.56875: done getting the remaining hosts for this loop 30564 1726882869.56878: getting the next task for host managed_node2 30564 1726882869.56884: done getting next task for host managed_node2 30564 1726882869.56886: ^ task is: TASK: Success in test '{{ lsr_description }}' 30564 1726882869.56889: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882869.56892: getting variables 30564 1726882869.56894: in VariableManager get_vars() 30564 1726882869.56928: Calling all_inventory to load vars for managed_node2 30564 1726882869.56931: Calling groups_inventory to load vars for managed_node2 30564 1726882869.56934: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882869.56943: Calling all_plugins_play to load vars for managed_node2 30564 1726882869.56945: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882869.56948: Calling groups_plugins_play to load vars for managed_node2 30564 1726882869.57778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882869.59538: done with get_vars() 30564 1726882869.59555: done getting variables 30564 1726882869.59600: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882869.59694: variable 'lsr_description' from source: include params TASK [Success in test 'I can remove an existing profile without taking it down'] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:41:09 -0400 (0:00:00.062) 0:01:08.178 ****** 30564 1726882869.59718: entering _queue_task() for managed_node2/debug 30564 1726882869.59931: worker is 1 
(out of 1 available) 30564 1726882869.59949: exiting _queue_task() for managed_node2/debug 30564 1726882869.59967: done queuing things up, now waiting for results queue to drain 30564 1726882869.59970: waiting for pending results... 30564 1726882869.60248: running TaskExecutor() for managed_node2/TASK: Success in test 'I can remove an existing profile without taking it down' 30564 1726882869.60356: in run() - task 0e448fcc-3ce9-4216-acec-00000000100c 30564 1726882869.60373: variable 'ansible_search_path' from source: unknown 30564 1726882869.60377: variable 'ansible_search_path' from source: unknown 30564 1726882869.60404: calling self._execute() 30564 1726882869.60481: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.60485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.60494: variable 'omit' from source: magic vars 30564 1726882869.60826: variable 'ansible_distribution_major_version' from source: facts 30564 1726882869.60837: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882869.60842: variable 'omit' from source: magic vars 30564 1726882869.60870: variable 'omit' from source: magic vars 30564 1726882869.60954: variable 'lsr_description' from source: include params 30564 1726882869.60969: variable 'omit' from source: magic vars 30564 1726882869.61005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882869.61045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882869.61066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882869.61087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882869.61100: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882869.61142: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882869.61145: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.61147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.61226: Set connection var ansible_timeout to 10 30564 1726882869.61230: Set connection var ansible_pipelining to False 30564 1726882869.61234: Set connection var ansible_shell_type to sh 30564 1726882869.61237: Set connection var ansible_shell_executable to /bin/sh 30564 1726882869.61256: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882869.61259: Set connection var ansible_connection to ssh 30564 1726882869.61273: variable 'ansible_shell_executable' from source: unknown 30564 1726882869.61276: variable 'ansible_connection' from source: unknown 30564 1726882869.61279: variable 'ansible_module_compression' from source: unknown 30564 1726882869.61281: variable 'ansible_shell_type' from source: unknown 30564 1726882869.61284: variable 'ansible_shell_executable' from source: unknown 30564 1726882869.61286: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.61290: variable 'ansible_pipelining' from source: unknown 30564 1726882869.61292: variable 'ansible_timeout' from source: unknown 30564 1726882869.61296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.61426: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882869.61429: variable 'omit' from source: magic vars 30564 1726882869.61434: starting attempt 
loop 30564 1726882869.61437: running the handler 30564 1726882869.61476: handler run complete 30564 1726882869.61488: attempt loop complete, returning result 30564 1726882869.61491: _execute() done 30564 1726882869.61494: dumping result to json 30564 1726882869.61496: done dumping result, returning 30564 1726882869.61503: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can remove an existing profile without taking it down' [0e448fcc-3ce9-4216-acec-00000000100c] 30564 1726882869.61508: sending task result for task 0e448fcc-3ce9-4216-acec-00000000100c 30564 1726882869.61601: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000100c 30564 1726882869.61603: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can remove an existing profile without taking it down' +++++ 30564 1726882869.61667: no more pending results, returning what we have 30564 1726882869.61671: results queue empty 30564 1726882869.61674: checking for any_errors_fatal 30564 1726882869.61685: done checking for any_errors_fatal 30564 1726882869.61686: checking for max_fail_percentage 30564 1726882869.61687: done checking for max_fail_percentage 30564 1726882869.61688: checking to see if all hosts have failed and the running result is not ok 30564 1726882869.61689: done checking to see if all hosts have failed 30564 1726882869.61689: getting the remaining hosts for this loop 30564 1726882869.61691: done getting the remaining hosts for this loop 30564 1726882869.61695: getting the next task for host managed_node2 30564 1726882869.61701: done getting next task for host managed_node2 30564 1726882869.61704: ^ task is: TASK: Cleanup 30564 1726882869.61706: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882869.61710: getting variables 30564 1726882869.61711: in VariableManager get_vars() 30564 1726882869.61740: Calling all_inventory to load vars for managed_node2 30564 1726882869.61742: Calling groups_inventory to load vars for managed_node2 30564 1726882869.61745: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882869.61754: Calling all_plugins_play to load vars for managed_node2 30564 1726882869.61757: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882869.61759: Calling groups_plugins_play to load vars for managed_node2 30564 1726882869.62906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882869.63891: done with get_vars() 30564 1726882869.63907: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:41:09 -0400 (0:00:00.042) 0:01:08.221 ****** 30564 1726882869.63984: entering _queue_task() for managed_node2/include_tasks 30564 1726882869.64250: worker is 1 (out of 1 available) 30564 1726882869.64265: exiting _queue_task() for managed_node2/include_tasks 30564 1726882869.64285: done queuing things up, now waiting for results queue to drain 30564 1726882869.64287: waiting for pending results... 
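
The 'Success in test …' task that just completed (at `run_test.yml:47`) is a `debug` action whose message is templated from `lsr_description` (resolved from include params in the log). The message text below is taken verbatim from the `MSG:` line above; the surrounding task layout is an assumption:

```yaml
# Hedged sketch of run_test.yml:47. The msg text matches the MSG: output
# in this log; the rest of the task structure is an assumption.
- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"
```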
30564 1726882869.64560: running TaskExecutor() for managed_node2/TASK: Cleanup 30564 1726882869.64682: in run() - task 0e448fcc-3ce9-4216-acec-000000001010 30564 1726882869.64687: variable 'ansible_search_path' from source: unknown 30564 1726882869.64690: variable 'ansible_search_path' from source: unknown 30564 1726882869.64745: variable 'lsr_cleanup' from source: include params 30564 1726882869.64969: variable 'lsr_cleanup' from source: include params 30564 1726882869.65021: variable 'omit' from source: magic vars 30564 1726882869.65130: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.65137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.65148: variable 'omit' from source: magic vars 30564 1726882869.65382: variable 'ansible_distribution_major_version' from source: facts 30564 1726882869.65408: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882869.65411: variable 'item' from source: unknown 30564 1726882869.65457: variable 'item' from source: unknown 30564 1726882869.65494: variable 'item' from source: unknown 30564 1726882869.65554: variable 'item' from source: unknown 30564 1726882869.65699: dumping result to json 30564 1726882869.65702: done dumping result, returning 30564 1726882869.65722: done running TaskExecutor() for managed_node2/TASK: Cleanup [0e448fcc-3ce9-4216-acec-000000001010] 30564 1726882869.65726: sending task result for task 0e448fcc-3ce9-4216-acec-000000001010 30564 1726882869.65816: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001010 30564 1726882869.65837: WORKER PROCESS EXITING 30564 1726882869.65918: no more pending results, returning what we have 30564 1726882869.65925: in VariableManager get_vars() 30564 1726882869.65956: Calling all_inventory to load vars for managed_node2 30564 1726882869.65960: Calling groups_inventory to load vars for managed_node2 30564 1726882869.65969: Calling 
all_plugins_inventory to load vars for managed_node2 30564 1726882869.65979: Calling all_plugins_play to load vars for managed_node2 30564 1726882869.65982: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882869.65985: Calling groups_plugins_play to load vars for managed_node2 30564 1726882869.67422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882869.69117: done with get_vars() 30564 1726882869.69135: variable 'ansible_search_path' from source: unknown 30564 1726882869.69138: variable 'ansible_search_path' from source: unknown 30564 1726882869.69188: we have included files to process 30564 1726882869.69193: generating all_blocks data 30564 1726882869.69195: done generating all_blocks data 30564 1726882869.69204: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882869.69206: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882869.69208: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882869.69388: done processing included file 30564 1726882869.69390: iterating over new_blocks loaded from include file 30564 1726882869.69392: in VariableManager get_vars() 30564 1726882869.69417: done with get_vars() 30564 1726882869.69421: filtering new block on tags 30564 1726882869.69453: done filtering new block on tags 30564 1726882869.69456: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30564 1726882869.69466: extending task lists for all hosts with included blocks 
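
The 'Cleanup' include that was just processed (queued from `run_test.yml:66`) loops over `lsr_cleanup`, which the log resolves from include params, and the included item is shown as `tasks/cleanup_profile+device.yml`. A hedged sketch of the driving task; the exact loop syntax is an assumption:

```yaml
# Hedged sketch of the 'Cleanup' task at run_test.yml:66. The log
# confirms lsr_cleanup comes from include params and that the resolved
# item is tasks/cleanup_profile+device.yml; the loop form is assumed.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"   # e.g. ['tasks/cleanup_profile+device.yml']
```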
30564 1726882869.70596: done extending task lists 30564 1726882869.70598: done processing included files 30564 1726882869.70599: results queue empty 30564 1726882869.70599: checking for any_errors_fatal 30564 1726882869.70604: done checking for any_errors_fatal 30564 1726882869.70605: checking for max_fail_percentage 30564 1726882869.70606: done checking for max_fail_percentage 30564 1726882869.70607: checking to see if all hosts have failed and the running result is not ok 30564 1726882869.70608: done checking to see if all hosts have failed 30564 1726882869.70609: getting the remaining hosts for this loop 30564 1726882869.70610: done getting the remaining hosts for this loop 30564 1726882869.70613: getting the next task for host managed_node2 30564 1726882869.70620: done getting next task for host managed_node2 30564 1726882869.70626: ^ task is: TASK: Cleanup profile and device 30564 1726882869.70629: ^ state is: HOST STATE: block=6, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882869.70632: getting variables 30564 1726882869.70633: in VariableManager get_vars() 30564 1726882869.70645: Calling all_inventory to load vars for managed_node2 30564 1726882869.70647: Calling groups_inventory to load vars for managed_node2 30564 1726882869.70650: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882869.70655: Calling all_plugins_play to load vars for managed_node2 30564 1726882869.70657: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882869.70665: Calling groups_plugins_play to load vars for managed_node2 30564 1726882869.72184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882869.79199: done with get_vars() 30564 1726882869.79231: done getting variables 30564 1726882869.79299: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:41:09 -0400 (0:00:00.153) 0:01:08.374 ****** 30564 1726882869.79341: entering _queue_task() for managed_node2/shell 30564 1726882869.79808: worker is 1 (out of 1 available) 30564 1726882869.79826: exiting _queue_task() for managed_node2/shell 30564 1726882869.79839: done queuing things up, now waiting for results queue to drain 30564 1726882869.79841: waiting for pending results... 
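
The 'Cleanup profile and device' task queued here (at `cleanup_profile+device.yml:3`) is handled by the `shell` action plugin (which delegates to `command`, as the plugin loads below show) and templates the `interface` play variable. The log does not show the command text, so the commands in this sketch are assumptions typical of removing a NetworkManager profile and its device:

```yaml
# Hedged sketch of tasks/cleanup_profile+device.yml:3. Only "a shell task
# templating the 'interface' play var" is confirmed by this log; the
# nmcli/ip commands below are assumptions.
- name: Cleanup profile and device
  shell: |
    nmcli con delete {{ interface }}
    ip link del {{ interface }}
  ignore_errors: true
```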
30564 1726882869.80279: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30564 1726882869.80442: in run() - task 0e448fcc-3ce9-4216-acec-0000000016ad 30564 1726882869.80452: variable 'ansible_search_path' from source: unknown 30564 1726882869.80457: variable 'ansible_search_path' from source: unknown 30564 1726882869.80503: calling self._execute() 30564 1726882869.80659: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.80666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.80692: variable 'omit' from source: magic vars 30564 1726882869.81149: variable 'ansible_distribution_major_version' from source: facts 30564 1726882869.81160: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882869.81170: variable 'omit' from source: magic vars 30564 1726882869.81230: variable 'omit' from source: magic vars 30564 1726882869.81415: variable 'interface' from source: play vars 30564 1726882869.81438: variable 'omit' from source: magic vars 30564 1726882869.81494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882869.81546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882869.81580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882869.81608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882869.81617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882869.81660: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882869.81667: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.81669: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.81805: Set connection var ansible_timeout to 10 30564 1726882869.81817: Set connection var ansible_pipelining to False 30564 1726882869.81821: Set connection var ansible_shell_type to sh 30564 1726882869.81836: Set connection var ansible_shell_executable to /bin/sh 30564 1726882869.81847: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882869.81851: Set connection var ansible_connection to ssh 30564 1726882869.81881: variable 'ansible_shell_executable' from source: unknown 30564 1726882869.81885: variable 'ansible_connection' from source: unknown 30564 1726882869.81888: variable 'ansible_module_compression' from source: unknown 30564 1726882869.81899: variable 'ansible_shell_type' from source: unknown 30564 1726882869.81902: variable 'ansible_shell_executable' from source: unknown 30564 1726882869.81905: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882869.81907: variable 'ansible_pipelining' from source: unknown 30564 1726882869.81916: variable 'ansible_timeout' from source: unknown 30564 1726882869.81919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882869.82125: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882869.82137: variable 'omit' from source: magic vars 30564 1726882869.82140: starting attempt loop 30564 1726882869.82143: running the handler 30564 1726882869.82165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882869.82193: _low_level_execute_command(): starting 30564 1726882869.82210: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882869.83161: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882869.83175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.83193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.83210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.83276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.83281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.83300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882869.83304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.83358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882869.83377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882869.83403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882869.83579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882869.85199: stdout chunk (state=3): >>>/root <<< 30564 1726882869.85317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882869.85445: stderr chunk (state=3): >>><<< 30564 1726882869.85478: stdout chunk (state=3): >>><<< 30564 1726882869.85524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882869.85533: _low_level_execute_command(): starting 30564 1726882869.85548: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242 `" && echo ansible-tmp-1726882869.8551786-33516-187236721466242="` echo 
/root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242 `" ) && sleep 0' 30564 1726882869.86340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882869.86358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.86384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.86405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.86462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882869.86472: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882869.86486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.86502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882869.86529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882869.86533: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882869.86539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.86555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.86561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.86575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882869.86586: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882869.86591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.86648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 30564 1726882869.86659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882869.86776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882869.88649: stdout chunk (state=3): >>>ansible-tmp-1726882869.8551786-33516-187236721466242=/root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242 <<< 30564 1726882869.88758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882869.88811: stderr chunk (state=3): >>><<< 30564 1726882869.88814: stdout chunk (state=3): >>><<< 30564 1726882869.88827: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882869.8551786-33516-187236721466242=/root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882869.88854: variable 
'ansible_module_compression' from source: unknown 30564 1726882869.88898: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882869.88930: variable 'ansible_facts' from source: unknown 30564 1726882869.88994: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242/AnsiballZ_command.py 30564 1726882869.89099: Sending initial data 30564 1726882869.89102: Sent initial data (156 bytes) 30564 1726882869.89743: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882869.89746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.89789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.89793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.89795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.89845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882869.89848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882869.89852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
30564 1726882869.89950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882869.91722: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882869.91814: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882869.91911: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpmw1d0u95 /root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242/AnsiballZ_command.py <<< 30564 1726882869.92007: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882869.93021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882869.93109: stderr chunk (state=3): >>><<< 30564 1726882869.93113: stdout chunk (state=3): >>><<< 30564 1726882869.93131: done transferring module to remote 30564 1726882869.93138: _low_level_execute_command(): starting 30564 1726882869.93146: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242/ /root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242/AnsiballZ_command.py && sleep 0' 30564 1726882869.93547: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.93554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.93591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882869.93596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.93606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882869.93612: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.93621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.93627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.93682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882869.93693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882869.93700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882869.93813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882869.95563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882869.95607: stderr chunk (state=3): >>><<< 30564 1726882869.95610: stdout chunk (state=3): >>><<< 30564 1726882869.95624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882869.95627: _low_level_execute_command(): starting 30564 1726882869.95630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242/AnsiballZ_command.py && sleep 0' 30564 1726882869.96043: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882869.96055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.96078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882869.96090: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882869.96100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882869.96148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882869.96155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882869.96278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.16091: stdout chunk (state=3): >>> {"changed": true, "stdout": "Connection 'statebr' (ef91e5fd-4b93-4ee4-ae54-4de7a703b196) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:41:10.091006", "end": "2024-09-20 21:41:10.158812", "delta": "0:00:00.067806", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882870.17265: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882870.17318: stderr chunk (state=3): >>><<< 30564 1726882870.17321: stdout chunk (state=3): >>><<< 30564 1726882870.17344: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "Connection 'statebr' (ef91e5fd-4b93-4ee4-ae54-4de7a703b196) successfully deleted.", "stderr": "Cannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:41:10.091006", "end": "2024-09-20 21:41:10.158812", "delta": "0:00:00.067806", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 30564 1726882870.17374: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882870.17383: _low_level_execute_command(): starting 30564 1726882870.17386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882869.8551786-33516-187236721466242/ > /dev/null 2>&1 && sleep 0' 30564 1726882870.17851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.17855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.17893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882870.17898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.17952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882870.17955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882870.17957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882870.18060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.19917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882870.19969: stderr chunk (state=3): >>><<< 30564 1726882870.19972: stdout chunk (state=3): >>><<< 30564 1726882870.19988: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882870.19994: handler run complete 30564 1726882870.20011: Evaluated conditional (False): False 30564 1726882870.20023: attempt loop complete, returning result 30564 1726882870.20026: _execute() done 30564 1726882870.20029: dumping result to json 30564 1726882870.20033: done dumping result, returning 30564 1726882870.20042: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0e448fcc-3ce9-4216-acec-0000000016ad] 30564 1726882870.20048: sending task result for task 0e448fcc-3ce9-4216-acec-0000000016ad 30564 1726882870.20150: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000016ad 30564 1726882870.20153: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.067806", "end": "2024-09-20 21:41:10.158812", "rc": 1, "start": "2024-09-20 21:41:10.091006" } STDOUT: Connection 'statebr' (ef91e5fd-4b93-4ee4-ae54-4de7a703b196) successfully deleted. 
STDERR: Cannot find device "statebr" MSG: non-zero return code ...ignoring 30564 1726882870.20217: no more pending results, returning what we have 30564 1726882870.20221: results queue empty 30564 1726882870.20222: checking for any_errors_fatal 30564 1726882870.20224: done checking for any_errors_fatal 30564 1726882870.20224: checking for max_fail_percentage 30564 1726882870.20226: done checking for max_fail_percentage 30564 1726882870.20227: checking to see if all hosts have failed and the running result is not ok 30564 1726882870.20228: done checking to see if all hosts have failed 30564 1726882870.20229: getting the remaining hosts for this loop 30564 1726882870.20231: done getting the remaining hosts for this loop 30564 1726882870.20234: getting the next task for host managed_node2 30564 1726882870.20246: done getting next task for host managed_node2 30564 1726882870.20248: ^ task is: TASK: Include the task 'run_test.yml' 30564 1726882870.20250: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882870.20255: getting variables 30564 1726882870.20257: in VariableManager get_vars() 30564 1726882870.20299: Calling all_inventory to load vars for managed_node2 30564 1726882870.20302: Calling groups_inventory to load vars for managed_node2 30564 1726882870.20305: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.20317: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.20319: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.20321: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.21193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.22148: done with get_vars() 30564 1726882870.22169: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:102 Friday 20 September 2024 21:41:10 -0400 (0:00:00.429) 0:01:08.803 ****** 30564 1726882870.22237: entering _queue_task() for managed_node2/include_tasks 30564 1726882870.22476: worker is 1 (out of 1 available) 30564 1726882870.22490: exiting _queue_task() for managed_node2/include_tasks 30564 1726882870.22502: done queuing things up, now waiting for results queue to drain 30564 1726882870.22503: waiting for pending results... 
30564 1726882870.22696: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30564 1726882870.22767: in run() - task 0e448fcc-3ce9-4216-acec-000000000015 30564 1726882870.22779: variable 'ansible_search_path' from source: unknown 30564 1726882870.22808: calling self._execute() 30564 1726882870.22894: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.22898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.22907: variable 'omit' from source: magic vars 30564 1726882870.23210: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.23221: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.23226: _execute() done 30564 1726882870.23229: dumping result to json 30564 1726882870.23232: done dumping result, returning 30564 1726882870.23238: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-4216-acec-000000000015] 30564 1726882870.23244: sending task result for task 0e448fcc-3ce9-4216-acec-000000000015 30564 1726882870.23353: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000015 30564 1726882870.23355: WORKER PROCESS EXITING 30564 1726882870.23399: no more pending results, returning what we have 30564 1726882870.23404: in VariableManager get_vars() 30564 1726882870.23445: Calling all_inventory to load vars for managed_node2 30564 1726882870.23448: Calling groups_inventory to load vars for managed_node2 30564 1726882870.23451: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.23468: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.23472: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.23479: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.24468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30564 1726882870.25397: done with get_vars() 30564 1726882870.25410: variable 'ansible_search_path' from source: unknown 30564 1726882870.25421: we have included files to process 30564 1726882870.25422: generating all_blocks data 30564 1726882870.25423: done generating all_blocks data 30564 1726882870.25427: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882870.25428: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882870.25430: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882870.25691: in VariableManager get_vars() 30564 1726882870.25704: done with get_vars() 30564 1726882870.25728: in VariableManager get_vars() 30564 1726882870.25738: done with get_vars() 30564 1726882870.25768: in VariableManager get_vars() 30564 1726882870.25780: done with get_vars() 30564 1726882870.25806: in VariableManager get_vars() 30564 1726882870.25816: done with get_vars() 30564 1726882870.25840: in VariableManager get_vars() 30564 1726882870.25851: done with get_vars() 30564 1726882870.26107: in VariableManager get_vars() 30564 1726882870.26119: done with get_vars() 30564 1726882870.26126: done processing included file 30564 1726882870.26128: iterating over new_blocks loaded from include file 30564 1726882870.26129: in VariableManager get_vars() 30564 1726882870.26135: done with get_vars() 30564 1726882870.26136: filtering new block on tags 30564 1726882870.26199: done filtering new block on tags 30564 1726882870.26201: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30564 1726882870.26205: extending task lists for all hosts with included 
blocks 30564 1726882870.26227: done extending task lists 30564 1726882870.26227: done processing included files 30564 1726882870.26228: results queue empty 30564 1726882870.26228: checking for any_errors_fatal 30564 1726882870.26231: done checking for any_errors_fatal 30564 1726882870.26231: checking for max_fail_percentage 30564 1726882870.26232: done checking for max_fail_percentage 30564 1726882870.26233: checking to see if all hosts have failed and the running result is not ok 30564 1726882870.26233: done checking to see if all hosts have failed 30564 1726882870.26234: getting the remaining hosts for this loop 30564 1726882870.26234: done getting the remaining hosts for this loop 30564 1726882870.26236: getting the next task for host managed_node2 30564 1726882870.26238: done getting next task for host managed_node2 30564 1726882870.26240: ^ task is: TASK: TEST: {{ lsr_description }} 30564 1726882870.26242: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882870.26243: getting variables 30564 1726882870.26244: in VariableManager get_vars() 30564 1726882870.26250: Calling all_inventory to load vars for managed_node2 30564 1726882870.26252: Calling groups_inventory to load vars for managed_node2 30564 1726882870.26253: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.26257: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.26258: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.26260: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.26925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.27896: done with get_vars() 30564 1726882870.27912: done getting variables 30564 1726882870.27939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882870.28022: variable 'lsr_description' from source: include params TASK [TEST: I can take a profile down that is absent] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:41:10 -0400 (0:00:00.058) 0:01:08.861 ****** 30564 1726882870.28043: entering _queue_task() for managed_node2/debug 30564 1726882870.28276: worker is 1 (out of 1 available) 30564 1726882870.28290: exiting _queue_task() for managed_node2/debug 30564 1726882870.28302: done queuing things up, now waiting for results queue to drain 30564 1726882870.28304: waiting for pending results... 
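The `TASK [TEST: I can take a profile down that is absent]` banner above points at `run_test.yml:5`. That task file is not part of this log, but from the templated task name (`TEST: {{ lsr_description }}`) and the `MSG: ########## … ##########` result that appears further down, it is plausibly a `debug` task along these lines (a hypothetical reconstruction, not the actual file contents):

```yaml
# Hypothetical sketch of run_test.yml:5, inferred from the log output:
# the task name is templated from lsr_description, and the result message
# wraps the same value in '#' banners.
- name: "TEST: {{ lsr_description }}"
  debug:
    msg: "########## {{ lsr_description }} ##########"
```

This would explain why the executor resolves `lsr_description` from "include params" immediately before loading the `debug` action plugin.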
30564 1726882870.28499: running TaskExecutor() for managed_node2/TASK: TEST: I can take a profile down that is absent 30564 1726882870.28567: in run() - task 0e448fcc-3ce9-4216-acec-000000001744 30564 1726882870.28585: variable 'ansible_search_path' from source: unknown 30564 1726882870.28589: variable 'ansible_search_path' from source: unknown 30564 1726882870.28616: calling self._execute() 30564 1726882870.28700: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.28704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.28714: variable 'omit' from source: magic vars 30564 1726882870.28994: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.29007: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.29011: variable 'omit' from source: magic vars 30564 1726882870.29039: variable 'omit' from source: magic vars 30564 1726882870.29112: variable 'lsr_description' from source: include params 30564 1726882870.29126: variable 'omit' from source: magic vars 30564 1726882870.29157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882870.29189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.29204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882870.29217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.29230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.29257: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.29260: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882870.29263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.29336: Set connection var ansible_timeout to 10 30564 1726882870.29343: Set connection var ansible_pipelining to False 30564 1726882870.29346: Set connection var ansible_shell_type to sh 30564 1726882870.29351: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.29358: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.29360: Set connection var ansible_connection to ssh 30564 1726882870.29383: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.29386: variable 'ansible_connection' from source: unknown 30564 1726882870.29389: variable 'ansible_module_compression' from source: unknown 30564 1726882870.29391: variable 'ansible_shell_type' from source: unknown 30564 1726882870.29394: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.29396: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.29401: variable 'ansible_pipelining' from source: unknown 30564 1726882870.29404: variable 'ansible_timeout' from source: unknown 30564 1726882870.29409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.29512: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.29522: variable 'omit' from source: magic vars 30564 1726882870.29526: starting attempt loop 30564 1726882870.29529: running the handler 30564 1726882870.29570: handler run complete 30564 1726882870.29582: attempt loop complete, returning result 30564 1726882870.29585: _execute() done 30564 1726882870.29587: dumping result to json 30564 1726882870.29590: done dumping result, returning 
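The repeated `Set connection var …` records above show the values the executor resolved for this run: `ansible_timeout` 10, `ansible_pipelining` False, `ansible_shell_type` sh, `ansible_shell_executable` /bin/sh, `ansible_module_compression` ZIP_DEFLATED, `ansible_connection` ssh. Per the header, no config file was found, so these are built-in defaults. For orientation only, an `ansible.cfg` pinning the same values explicitly might look roughly like this (the section placement of some keys varies across ansible-core versions, so treat the sections here as assumptions):

```ini
; Sketch only: these mirror the defaults resolved in the log above.
; Section names for pipelining differ between ansible-core releases.
[defaults]
timeout = 10
executable = /bin/sh
module_compression = 'ZIP_DEFLATED'

[connection]
pipelining = False
```

Since every variable in these records reports `from source: unknown` (i.e. not set by inventory or play), the defaults are what take effect.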
30564 1726882870.29596: done running TaskExecutor() for managed_node2/TASK: TEST: I can take a profile down that is absent [0e448fcc-3ce9-4216-acec-000000001744] 30564 1726882870.29601: sending task result for task 0e448fcc-3ce9-4216-acec-000000001744 30564 1726882870.29681: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001744 30564 1726882870.29684: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ########## I can take a profile down that is absent ########## 30564 1726882870.29726: no more pending results, returning what we have 30564 1726882870.29730: results queue empty 30564 1726882870.29731: checking for any_errors_fatal 30564 1726882870.29733: done checking for any_errors_fatal 30564 1726882870.29733: checking for max_fail_percentage 30564 1726882870.29735: done checking for max_fail_percentage 30564 1726882870.29736: checking to see if all hosts have failed and the running result is not ok 30564 1726882870.29736: done checking to see if all hosts have failed 30564 1726882870.29737: getting the remaining hosts for this loop 30564 1726882870.29739: done getting the remaining hosts for this loop 30564 1726882870.29742: getting the next task for host managed_node2 30564 1726882870.29749: done getting next task for host managed_node2 30564 1726882870.29751: ^ task is: TASK: Show item 30564 1726882870.29756: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882870.29759: getting variables 30564 1726882870.29761: in VariableManager get_vars() 30564 1726882870.29799: Calling all_inventory to load vars for managed_node2 30564 1726882870.29801: Calling groups_inventory to load vars for managed_node2 30564 1726882870.29805: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.29815: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.29817: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.29820: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.30629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.31580: done with get_vars() 30564 1726882870.31596: done getting variables 30564 1726882870.31633: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:41:10 -0400 (0:00:00.036) 0:01:08.897 ****** 30564 1726882870.31651: entering _queue_task() for managed_node2/debug 30564 1726882870.31848: worker is 1 (out of 1 available) 30564 1726882870.31860: exiting _queue_task() for managed_node2/debug 30564 1726882870.31879: done queuing things up, now waiting for results queue to drain 30564 1726882870.31880: waiting for pending results... 
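The `TASK [Show item]` banner above points at `run_test.yml:9`. The per-item results that follow (`ansible_loop_var: "item"`, one `ok:` block each for `lsr_description`, `lsr_setup`, `lsr_test`, `lsr_assert`, `lsr_assert_when`, `lsr_fail_debug`) are consistent with a `debug` task that loops over variable names and prints each one — a plausible reconstruction, not the actual file:

```yaml
# Hypothetical sketch of run_test.yml:9, inferred from the loop output below:
# each iteration resolves the named variable and debug prints it, which is
# why the result shows both "item" and the variable's own key.
- name: Show item
  debug:
    var: "{{ item }}"
  loop:
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
```

This also matches the executor trace: for each iteration it resolves `item` twice (once for templating, once for the result) and then the target variable from its source (include params, play vars, or set_fact).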
30564 1726882870.32048: running TaskExecutor() for managed_node2/TASK: Show item 30564 1726882870.32110: in run() - task 0e448fcc-3ce9-4216-acec-000000001745 30564 1726882870.32122: variable 'ansible_search_path' from source: unknown 30564 1726882870.32125: variable 'ansible_search_path' from source: unknown 30564 1726882870.32174: variable 'omit' from source: magic vars 30564 1726882870.32280: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.32288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.32296: variable 'omit' from source: magic vars 30564 1726882870.32550: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.32561: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.32568: variable 'omit' from source: magic vars 30564 1726882870.32604: variable 'omit' from source: magic vars 30564 1726882870.32631: variable 'item' from source: unknown 30564 1726882870.32685: variable 'item' from source: unknown 30564 1726882870.32697: variable 'omit' from source: magic vars 30564 1726882870.32731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882870.32757: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.32778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882870.32793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.32802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.32826: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.32829: variable 'ansible_host' from source: host vars for 'managed_node2' 
30564 1726882870.32832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.32901: Set connection var ansible_timeout to 10 30564 1726882870.32905: Set connection var ansible_pipelining to False 30564 1726882870.32908: Set connection var ansible_shell_type to sh 30564 1726882870.32914: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.32921: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.32926: Set connection var ansible_connection to ssh 30564 1726882870.32942: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.32945: variable 'ansible_connection' from source: unknown 30564 1726882870.32948: variable 'ansible_module_compression' from source: unknown 30564 1726882870.32950: variable 'ansible_shell_type' from source: unknown 30564 1726882870.32952: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.32956: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.32960: variable 'ansible_pipelining' from source: unknown 30564 1726882870.32962: variable 'ansible_timeout' from source: unknown 30564 1726882870.32971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.33070: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.33076: variable 'omit' from source: magic vars 30564 1726882870.33082: starting attempt loop 30564 1726882870.33085: running the handler 30564 1726882870.33122: variable 'lsr_description' from source: include params 30564 1726882870.33170: variable 'lsr_description' from source: include params 30564 1726882870.33176: handler run complete 30564 1726882870.33189: attempt loop 
complete, returning result 30564 1726882870.33201: variable 'item' from source: unknown 30564 1726882870.33248: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I can take a profile down that is absent" } 30564 1726882870.33400: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.33403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.33406: variable 'omit' from source: magic vars 30564 1726882870.33465: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.33471: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.33475: variable 'omit' from source: magic vars 30564 1726882870.33484: variable 'omit' from source: magic vars 30564 1726882870.33512: variable 'item' from source: unknown 30564 1726882870.33556: variable 'item' from source: unknown 30564 1726882870.33568: variable 'omit' from source: magic vars 30564 1726882870.33588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.33594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.33600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.33609: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.33612: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.33614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.33666: Set connection var ansible_timeout to 10 30564 1726882870.33670: Set connection var ansible_pipelining to False 30564 
1726882870.33676: Set connection var ansible_shell_type to sh 30564 1726882870.33685: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.33691: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.33693: Set connection var ansible_connection to ssh 30564 1726882870.33710: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.33712: variable 'ansible_connection' from source: unknown 30564 1726882870.33715: variable 'ansible_module_compression' from source: unknown 30564 1726882870.33718: variable 'ansible_shell_type' from source: unknown 30564 1726882870.33720: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.33723: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.33725: variable 'ansible_pipelining' from source: unknown 30564 1726882870.33727: variable 'ansible_timeout' from source: unknown 30564 1726882870.33732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.33794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.33802: variable 'omit' from source: magic vars 30564 1726882870.33805: starting attempt loop 30564 1726882870.33807: running the handler 30564 1726882870.33824: variable 'lsr_setup' from source: include params 30564 1726882870.33873: variable 'lsr_setup' from source: include params 30564 1726882870.33912: handler run complete 30564 1726882870.33922: attempt loop complete, returning result 30564 1726882870.33933: variable 'item' from source: unknown 30564 1726882870.33981: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ 
"tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove_profile.yml" ] } 30564 1726882870.34064: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.34070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.34072: variable 'omit' from source: magic vars 30564 1726882870.34170: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.34176: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.34179: variable 'omit' from source: magic vars 30564 1726882870.34193: variable 'omit' from source: magic vars 30564 1726882870.34221: variable 'item' from source: unknown 30564 1726882870.34265: variable 'item' from source: unknown 30564 1726882870.34278: variable 'omit' from source: magic vars 30564 1726882870.34294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.34299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.34305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.34315: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.34318: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.34320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.34367: Set connection var ansible_timeout to 10 30564 1726882870.34374: Set connection var ansible_pipelining to False 30564 1726882870.34376: Set connection var ansible_shell_type to sh 30564 1726882870.34381: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.34387: Set connection var ansible_module_compression to 
ZIP_DEFLATED 30564 1726882870.34390: Set connection var ansible_connection to ssh 30564 1726882870.34405: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.34408: variable 'ansible_connection' from source: unknown 30564 1726882870.34410: variable 'ansible_module_compression' from source: unknown 30564 1726882870.34412: variable 'ansible_shell_type' from source: unknown 30564 1726882870.34415: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.34421: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.34423: variable 'ansible_pipelining' from source: unknown 30564 1726882870.34425: variable 'ansible_timeout' from source: unknown 30564 1726882870.34427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.34486: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.34492: variable 'omit' from source: magic vars 30564 1726882870.34495: starting attempt loop 30564 1726882870.34497: running the handler 30564 1726882870.34513: variable 'lsr_test' from source: include params 30564 1726882870.34559: variable 'lsr_test' from source: include params 30564 1726882870.34575: handler run complete 30564 1726882870.34585: attempt loop complete, returning result 30564 1726882870.34596: variable 'item' from source: unknown 30564 1726882870.34640: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30564 1726882870.34712: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.34715: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882870.34726: variable 'omit' from source: magic vars 30564 1726882870.34824: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.34827: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.34834: variable 'omit' from source: magic vars 30564 1726882870.34842: variable 'omit' from source: magic vars 30564 1726882870.34873: variable 'item' from source: unknown 30564 1726882870.34916: variable 'item' from source: unknown 30564 1726882870.34927: variable 'omit' from source: magic vars 30564 1726882870.34942: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.34945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.34951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.34962: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.34965: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.34971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.35013: Set connection var ansible_timeout to 10 30564 1726882870.35016: Set connection var ansible_pipelining to False 30564 1726882870.35019: Set connection var ansible_shell_type to sh 30564 1726882870.35024: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.35030: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.35033: Set connection var ansible_connection to ssh 30564 1726882870.35050: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.35052: variable 'ansible_connection' from source: unknown 30564 1726882870.35055: variable 
'ansible_module_compression' from source: unknown 30564 1726882870.35057: variable 'ansible_shell_type' from source: unknown 30564 1726882870.35059: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.35061: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.35067: variable 'ansible_pipelining' from source: unknown 30564 1726882870.35070: variable 'ansible_timeout' from source: unknown 30564 1726882870.35072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.35127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.35134: variable 'omit' from source: magic vars 30564 1726882870.35137: starting attempt loop 30564 1726882870.35139: running the handler 30564 1726882870.35156: variable 'lsr_assert' from source: include params 30564 1726882870.35204: variable 'lsr_assert' from source: include params 30564 1726882870.35215: handler run complete 30564 1726882870.35225: attempt loop complete, returning result 30564 1726882870.35235: variable 'item' from source: unknown 30564 1726882870.35281: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml" ] } 30564 1726882870.35353: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.35356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.35361: variable 'omit' from source: magic vars 30564 1726882870.35488: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.35491: Evaluated conditional (ansible_distribution_major_version != '6'): True 
30564 1726882870.35497: variable 'omit' from source: magic vars 30564 1726882870.35506: variable 'omit' from source: magic vars 30564 1726882870.35533: variable 'item' from source: unknown 30564 1726882870.35578: variable 'item' from source: unknown 30564 1726882870.35588: variable 'omit' from source: magic vars 30564 1726882870.35602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.35612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.35615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.35623: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.35626: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.35628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.35675: Set connection var ansible_timeout to 10 30564 1726882870.35678: Set connection var ansible_pipelining to False 30564 1726882870.35681: Set connection var ansible_shell_type to sh 30564 1726882870.35686: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.35692: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.35694: Set connection var ansible_connection to ssh 30564 1726882870.35711: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.35714: variable 'ansible_connection' from source: unknown 30564 1726882870.35720: variable 'ansible_module_compression' from source: unknown 30564 1726882870.35723: variable 'ansible_shell_type' from source: unknown 30564 1726882870.35725: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.35731: variable 'ansible_host' from source: host vars 
for 'managed_node2' 30564 1726882870.35733: variable 'ansible_pipelining' from source: unknown 30564 1726882870.35735: variable 'ansible_timeout' from source: unknown 30564 1726882870.35737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.35791: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.35797: variable 'omit' from source: magic vars 30564 1726882870.35800: starting attempt loop 30564 1726882870.35802: running the handler 30564 1726882870.35819: variable 'lsr_assert_when' from source: include params 30564 1726882870.35864: variable 'lsr_assert_when' from source: include params 30564 1726882870.35923: variable 'network_provider' from source: set_fact 30564 1726882870.35948: handler run complete 30564 1726882870.35959: attempt loop complete, returning result 30564 1726882870.35973: variable 'item' from source: unknown 30564 1726882870.36014: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30564 1726882870.36095: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.36098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.36101: variable 'omit' from source: magic vars 30564 1726882870.36197: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.36200: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.36204: variable 'omit' from source: magic vars 30564 1726882870.36215: variable 'omit' from source: magic vars 30564 1726882870.36241: 
variable 'item' from source: unknown 30564 1726882870.36287: variable 'item' from source: unknown 30564 1726882870.36298: variable 'omit' from source: magic vars 30564 1726882870.36310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.36316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.36322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.36330: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.36332: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.36335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.36386: Set connection var ansible_timeout to 10 30564 1726882870.36389: Set connection var ansible_pipelining to False 30564 1726882870.36391: Set connection var ansible_shell_type to sh 30564 1726882870.36396: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.36402: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.36405: Set connection var ansible_connection to ssh 30564 1726882870.36420: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.36423: variable 'ansible_connection' from source: unknown 30564 1726882870.36425: variable 'ansible_module_compression' from source: unknown 30564 1726882870.36428: variable 'ansible_shell_type' from source: unknown 30564 1726882870.36430: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.36432: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.36436: variable 'ansible_pipelining' from source: unknown 30564 1726882870.36439: variable 'ansible_timeout' from 
source: unknown 30564 1726882870.36443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.36504: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.36510: variable 'omit' from source: magic vars 30564 1726882870.36513: starting attempt loop 30564 1726882870.36515: running the handler 30564 1726882870.36529: variable 'lsr_fail_debug' from source: play vars 30564 1726882870.36574: variable 'lsr_fail_debug' from source: play vars 30564 1726882870.36588: handler run complete 30564 1726882870.36599: attempt loop complete, returning result 30564 1726882870.36610: variable 'item' from source: unknown 30564 1726882870.36651: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30564 1726882870.36733: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.36737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.36740: variable 'omit' from source: magic vars 30564 1726882870.36831: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.36835: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.36842: variable 'omit' from source: magic vars 30564 1726882870.36850: variable 'omit' from source: magic vars 30564 1726882870.36879: variable 'item' from source: unknown 30564 1726882870.36922: variable 'item' from source: unknown 30564 1726882870.36933: variable 'omit' from source: magic vars 30564 1726882870.36947: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.36956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.36959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.36967: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.36974: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.36976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.37027: Set connection var ansible_timeout to 10 30564 1726882870.37031: Set connection var ansible_pipelining to False 30564 1726882870.37034: Set connection var ansible_shell_type to sh 30564 1726882870.37036: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.37042: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.37044: Set connection var ansible_connection to ssh 30564 1726882870.37061: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.37064: variable 'ansible_connection' from source: unknown 30564 1726882870.37069: variable 'ansible_module_compression' from source: unknown 30564 1726882870.37072: variable 'ansible_shell_type' from source: unknown 30564 1726882870.37074: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.37076: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.37078: variable 'ansible_pipelining' from source: unknown 30564 1726882870.37082: variable 'ansible_timeout' from source: unknown 30564 1726882870.37086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.37143: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.37149: variable 'omit' from source: magic vars 30564 1726882870.37152: starting attempt loop 30564 1726882870.37154: running the handler 30564 1726882870.37169: variable 'lsr_cleanup' from source: include params 30564 1726882870.37213: variable 'lsr_cleanup' from source: include params 30564 1726882870.37224: handler run complete 30564 1726882870.37235: attempt loop complete, returning result 30564 1726882870.37247: variable 'item' from source: unknown 30564 1726882870.37293: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml" ] } 30564 1726882870.37370: dumping result to json 30564 1726882870.37373: done dumping result, returning 30564 1726882870.37375: done running TaskExecutor() for managed_node2/TASK: Show item [0e448fcc-3ce9-4216-acec-000000001745] 30564 1726882870.37377: sending task result for task 0e448fcc-3ce9-4216-acec-000000001745 30564 1726882870.37473: no more pending results, returning what we have 30564 1726882870.37477: results queue empty 30564 1726882870.37479: checking for any_errors_fatal 30564 1726882870.37485: done checking for any_errors_fatal 30564 1726882870.37486: checking for max_fail_percentage 30564 1726882870.37487: done checking for max_fail_percentage 30564 1726882870.37488: checking to see if all hosts have failed and the running result is not ok 30564 1726882870.37489: done checking to see if all hosts have failed 30564 1726882870.37490: getting the remaining hosts for this loop 30564 1726882870.37492: done getting the remaining hosts for this loop 30564 1726882870.37495: getting the next task for host managed_node2 
30564 1726882870.37501: done getting next task for host managed_node2 30564 1726882870.37504: ^ task is: TASK: Include the task 'show_interfaces.yml' 30564 1726882870.37506: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882870.37510: getting variables 30564 1726882870.37511: in VariableManager get_vars() 30564 1726882870.37541: Calling all_inventory to load vars for managed_node2 30564 1726882870.37543: Calling groups_inventory to load vars for managed_node2 30564 1726882870.37546: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.37555: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.37558: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.37561: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.38478: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001745 30564 1726882870.38493: WORKER PROCESS EXITING 30564 1726882870.38502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.39433: done with get_vars() 30564 1726882870.39449: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:41:10 -0400 (0:00:00.078) 0:01:08.976 ****** 30564 
1726882870.39512: entering _queue_task() for managed_node2/include_tasks 30564 1726882870.39712: worker is 1 (out of 1 available) 30564 1726882870.39725: exiting _queue_task() for managed_node2/include_tasks 30564 1726882870.39738: done queuing things up, now waiting for results queue to drain 30564 1726882870.39739: waiting for pending results... 30564 1726882870.39918: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30564 1726882870.39985: in run() - task 0e448fcc-3ce9-4216-acec-000000001746 30564 1726882870.39997: variable 'ansible_search_path' from source: unknown 30564 1726882870.40001: variable 'ansible_search_path' from source: unknown 30564 1726882870.40034: calling self._execute() 30564 1726882870.40113: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.40118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.40127: variable 'omit' from source: magic vars 30564 1726882870.40408: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.40419: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.40426: _execute() done 30564 1726882870.40429: dumping result to json 30564 1726882870.40432: done dumping result, returning 30564 1726882870.40435: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-4216-acec-000000001746] 30564 1726882870.40445: sending task result for task 0e448fcc-3ce9-4216-acec-000000001746 30564 1726882870.40531: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001746 30564 1726882870.40534: WORKER PROCESS EXITING 30564 1726882870.40559: no more pending results, returning what we have 30564 1726882870.40565: in VariableManager get_vars() 30564 1726882870.40606: Calling all_inventory to load vars for managed_node2 30564 1726882870.40609: Calling groups_inventory to load vars for managed_node2 30564 
1726882870.40612: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.40624: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.40627: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.40629: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.41850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.42960: done with get_vars() 30564 1726882870.42978: variable 'ansible_search_path' from source: unknown 30564 1726882870.42979: variable 'ansible_search_path' from source: unknown 30564 1726882870.43006: we have included files to process 30564 1726882870.43007: generating all_blocks data 30564 1726882870.43008: done generating all_blocks data 30564 1726882870.43012: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882870.43012: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882870.43014: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882870.43085: in VariableManager get_vars() 30564 1726882870.43100: done with get_vars() 30564 1726882870.43173: done processing included file 30564 1726882870.43175: iterating over new_blocks loaded from include file 30564 1726882870.43176: in VariableManager get_vars() 30564 1726882870.43186: done with get_vars() 30564 1726882870.43187: filtering new block on tags 30564 1726882870.43210: done filtering new block on tags 30564 1726882870.43212: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30564 1726882870.43215: 
extending task lists for all hosts with included blocks 30564 1726882870.43488: done extending task lists 30564 1726882870.43489: done processing included files 30564 1726882870.43490: results queue empty 30564 1726882870.43490: checking for any_errors_fatal 30564 1726882870.43494: done checking for any_errors_fatal 30564 1726882870.43495: checking for max_fail_percentage 30564 1726882870.43495: done checking for max_fail_percentage 30564 1726882870.43496: checking to see if all hosts have failed and the running result is not ok 30564 1726882870.43496: done checking to see if all hosts have failed 30564 1726882870.43497: getting the remaining hosts for this loop 30564 1726882870.43498: done getting the remaining hosts for this loop 30564 1726882870.43500: getting the next task for host managed_node2 30564 1726882870.43502: done getting next task for host managed_node2 30564 1726882870.43504: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30564 1726882870.43506: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882870.43507: getting variables 30564 1726882870.43508: in VariableManager get_vars() 30564 1726882870.43515: Calling all_inventory to load vars for managed_node2 30564 1726882870.43517: Calling groups_inventory to load vars for managed_node2 30564 1726882870.43519: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.43522: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.43524: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.43526: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.44514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.46186: done with get_vars() 30564 1726882870.46213: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:41:10 -0400 (0:00:00.067) 0:01:09.044 ****** 30564 1726882870.46296: entering _queue_task() for managed_node2/include_tasks 30564 1726882870.46631: worker is 1 (out of 1 available) 30564 1726882870.46643: exiting _queue_task() for managed_node2/include_tasks 30564 1726882870.46656: done queuing things up, now waiting for results queue to drain 30564 1726882870.46657: waiting for pending results... 
30564 1726882870.46959: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30564 1726882870.47090: in run() - task 0e448fcc-3ce9-4216-acec-00000000176d 30564 1726882870.47114: variable 'ansible_search_path' from source: unknown 30564 1726882870.47123: variable 'ansible_search_path' from source: unknown 30564 1726882870.47166: calling self._execute() 30564 1726882870.47283: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.47300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.47311: variable 'omit' from source: magic vars 30564 1726882870.47622: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.47633: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.47639: _execute() done 30564 1726882870.47642: dumping result to json 30564 1726882870.47645: done dumping result, returning 30564 1726882870.47650: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-4216-acec-00000000176d] 30564 1726882870.47655: sending task result for task 0e448fcc-3ce9-4216-acec-00000000176d 30564 1726882870.47745: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000176d 30564 1726882870.47747: WORKER PROCESS EXITING 30564 1726882870.47780: no more pending results, returning what we have 30564 1726882870.47785: in VariableManager get_vars() 30564 1726882870.47825: Calling all_inventory to load vars for managed_node2 30564 1726882870.47828: Calling groups_inventory to load vars for managed_node2 30564 1726882870.47831: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.47844: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.47847: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.47850: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882870.48754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.50210: done with get_vars() 30564 1726882870.50225: variable 'ansible_search_path' from source: unknown 30564 1726882870.50226: variable 'ansible_search_path' from source: unknown 30564 1726882870.50255: we have included files to process 30564 1726882870.50256: generating all_blocks data 30564 1726882870.50257: done generating all_blocks data 30564 1726882870.50258: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882870.50259: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882870.50260: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882870.50438: done processing included file 30564 1726882870.50440: iterating over new_blocks loaded from include file 30564 1726882870.50441: in VariableManager get_vars() 30564 1726882870.50453: done with get_vars() 30564 1726882870.50454: filtering new block on tags 30564 1726882870.50482: done filtering new block on tags 30564 1726882870.50483: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30564 1726882870.50487: extending task lists for all hosts with included blocks 30564 1726882870.50586: done extending task lists 30564 1726882870.50587: done processing included files 30564 1726882870.50588: results queue empty 30564 1726882870.50588: checking for any_errors_fatal 30564 1726882870.50590: done checking for any_errors_fatal 30564 1726882870.50591: checking for max_fail_percentage 30564 1726882870.50592: done 
checking for max_fail_percentage 30564 1726882870.50592: checking to see if all hosts have failed and the running result is not ok 30564 1726882870.50593: done checking to see if all hosts have failed 30564 1726882870.50593: getting the remaining hosts for this loop 30564 1726882870.50594: done getting the remaining hosts for this loop 30564 1726882870.50596: getting the next task for host managed_node2 30564 1726882870.50598: done getting next task for host managed_node2 30564 1726882870.50600: ^ task is: TASK: Gather current interface info 30564 1726882870.50602: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882870.50604: getting variables 30564 1726882870.50604: in VariableManager get_vars() 30564 1726882870.50611: Calling all_inventory to load vars for managed_node2 30564 1726882870.50613: Calling groups_inventory to load vars for managed_node2 30564 1726882870.50614: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.50618: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.50619: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.50621: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.51313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.52495: done with get_vars() 30564 1726882870.52517: done getting variables 30564 1726882870.52556: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:41:10 -0400 (0:00:00.062) 0:01:09.107 ****** 30564 1726882870.52592: entering _queue_task() for managed_node2/command 30564 1726882870.52905: worker is 1 (out of 1 available) 30564 1726882870.52916: exiting _queue_task() for managed_node2/command 30564 1726882870.52928: done queuing things up, now waiting for results queue to drain 30564 1726882870.52929: waiting for pending results... 
30564 1726882870.53236: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30564 1726882870.53344: in run() - task 0e448fcc-3ce9-4216-acec-0000000017a8 30564 1726882870.53358: variable 'ansible_search_path' from source: unknown 30564 1726882870.53362: variable 'ansible_search_path' from source: unknown 30564 1726882870.53401: calling self._execute() 30564 1726882870.53504: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.53508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.53520: variable 'omit' from source: magic vars 30564 1726882870.53894: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.53906: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.53918: variable 'omit' from source: magic vars 30564 1726882870.53945: variable 'omit' from source: magic vars 30564 1726882870.53970: variable 'omit' from source: magic vars 30564 1726882870.54005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882870.54041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.54059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882870.54077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.54088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.54111: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.54115: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.54118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882870.54191: Set connection var ansible_timeout to 10 30564 1726882870.54195: Set connection var ansible_pipelining to False 30564 1726882870.54198: Set connection var ansible_shell_type to sh 30564 1726882870.54203: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.54210: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.54212: Set connection var ansible_connection to ssh 30564 1726882870.54230: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.54233: variable 'ansible_connection' from source: unknown 30564 1726882870.54238: variable 'ansible_module_compression' from source: unknown 30564 1726882870.54241: variable 'ansible_shell_type' from source: unknown 30564 1726882870.54243: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.54245: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.54247: variable 'ansible_pipelining' from source: unknown 30564 1726882870.54250: variable 'ansible_timeout' from source: unknown 30564 1726882870.54254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.54377: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.54385: variable 'omit' from source: magic vars 30564 1726882870.54391: starting attempt loop 30564 1726882870.54394: running the handler 30564 1726882870.54408: _low_level_execute_command(): starting 30564 1726882870.54417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882870.54908: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882870.54923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.55019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.55211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882870.55316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882870.55925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.56960: stdout chunk (state=3): >>>/root <<< 30564 1726882870.57109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882870.57179: stderr chunk (state=3): >>><<< 30564 1726882870.57192: stdout chunk (state=3): >>><<< 30564 1726882870.57241: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882870.57251: _low_level_execute_command(): starting 30564 1726882870.57255: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068 `" && echo ansible-tmp-1726882870.5722013-33541-45560319481068="` echo /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068 `" ) && sleep 0' 30564 1726882870.58226: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882870.58233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.58243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.58256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.58296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.58303: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882870.58313: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.58325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882870.58334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882870.58343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882870.58348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.58357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.58374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.58381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.58388: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882870.58397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.58472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882870.58488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882870.58500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882870.58628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.60505: stdout chunk (state=3): >>>ansible-tmp-1726882870.5722013-33541-45560319481068=/root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068 <<< 30564 1726882870.60613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882870.60675: stderr chunk (state=3): >>><<< 30564 1726882870.60679: stdout chunk (state=3): >>><<< 30564 1726882870.60971: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882870.5722013-33541-45560319481068=/root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882870.60975: variable 'ansible_module_compression' from source: unknown 30564 1726882870.60977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882870.60979: variable 'ansible_facts' from source: unknown 30564 1726882870.60981: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068/AnsiballZ_command.py 30564 1726882870.61551: Sending initial data 30564 1726882870.61574: Sent initial data (155 bytes) 30564 1726882870.62473: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882870.62476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.62507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882870.62511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.62514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.62581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882870.62584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882870.62697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.64469: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 <<< 30564 1726882870.64559: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882870.64660: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpvvli1x_n /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068/AnsiballZ_command.py <<< 30564 1726882870.64754: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882870.65856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882870.65938: stderr chunk (state=3): >>><<< 30564 1726882870.65941: stdout chunk (state=3): >>><<< 30564 1726882870.65965: done transferring module to remote 30564 1726882870.65980: _low_level_execute_command(): starting 30564 1726882870.65983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068/ /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068/AnsiballZ_command.py && sleep 0' 30564 1726882870.66839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882870.66848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.66859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.66877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.66913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.66923: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882870.66942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.66957: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 30564 1726882870.66967: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882870.66977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882870.66985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.66994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.67005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.67017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.67024: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882870.67035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.67117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882870.67133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882870.67146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882870.67289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.69052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882870.69103: stderr chunk (state=3): >>><<< 30564 1726882870.69130: stdout chunk (state=3): >>><<< 30564 1726882870.69149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882870.69152: _low_level_execute_command(): starting 30564 1726882870.69156: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068/AnsiballZ_command.py && sleep 0' 30564 1726882870.69828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882870.69836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.69846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.69859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.69911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.69919: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882870.69928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.69946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882870.69960: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882870.69965: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882870.69980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.69998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.70017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.70025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.70032: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882870.70041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.70180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882870.70195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882870.70208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882870.70356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.83945: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:10.833899", "end": "2024-09-20 21:41:10.837401", "delta": "0:00:00.003502", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882870.85180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882870.85283: stderr chunk (state=3): >>><<< 30564 1726882870.85291: stdout chunk (state=3): >>><<< 30564 1726882870.85312: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:10.833899", "end": "2024-09-20 21:41:10.837401", "delta": "0:00:00.003502", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
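The module result above ({"changed": true, "stdout": ..., "rc": 0, "cmd": ["ls", "-1"], ...}) is what `ansible.legacy.command` returns after running `ls -1` with `chdir: /sys/class/net` on the remote host. A minimal sketch of that run-and-report shape (function name and output keys are illustrative, not Ansible's internals; `/tmp` stands in for the sysfs directory):

```python
import json
import subprocess

# Hypothetical sketch: run a command in a working directory and report
# rc/stdout/stderr the way the command-module result above is shaped.
def run_command(argv, chdir):
    proc = subprocess.run(argv, cwd=chdir, capture_output=True, text=True)
    return {
        "cmd": argv,
        "rc": proc.returncode,
        "stdout": proc.stdout.rstrip("\n"),
        "stderr": proc.stderr,
    }

result = run_command(["ls", "-1"], "/tmp")
print(json.dumps(result))
```

On the managed node the same invocation against `/sys/class/net` yields the four interface names seen in the log's STDOUT.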
30564 1726882870.85390: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882870.85398: _low_level_execute_command(): starting 30564 1726882870.85401: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882870.5722013-33541-45560319481068/ > /dev/null 2>&1 && sleep 0' 30564 1726882870.86336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882870.86343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.86353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.86371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.86429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.86492: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882870.86502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.86514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882870.86522: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882870.86528: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882870.86536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882870.86546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882870.86556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882870.86564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882870.86575: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882870.86580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882870.86646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882870.86663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882870.86680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882870.86802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882870.88660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882870.88825: stderr chunk (state=3): >>><<< 30564 1726882870.88829: stdout chunk (state=3): >>><<< 30564 1726882870.88874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882870.88885: handler run complete 30564 1726882870.89031: Evaluated conditional (False): False 30564 1726882870.89045: attempt loop complete, returning result 30564 1726882870.89048: _execute() done 30564 1726882870.89050: dumping result to json 30564 1726882870.89056: done dumping result, returning 30564 1726882870.89066: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-4216-acec-0000000017a8] 30564 1726882870.89072: sending task result for task 0e448fcc-3ce9-4216-acec-0000000017a8 30564 1726882870.89366: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000017a8 30564 1726882870.89371: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003502", "end": "2024-09-20 21:41:10.837401", "rc": 0, "start": "2024-09-20 21:41:10.833899" } STDOUT: bonding_masters eth0 lo rpltstbr 30564 1726882870.89567: no more pending results, returning what we have 30564 1726882870.89574: results queue empty 30564 1726882870.89575: checking for any_errors_fatal 30564 1726882870.89577: done checking for any_errors_fatal 30564 1726882870.89577: checking for max_fail_percentage 30564 1726882870.89579: done checking for max_fail_percentage 30564 1726882870.89580: checking to see if all 
hosts have failed and the running result is not ok 30564 1726882870.89581: done checking to see if all hosts have failed 30564 1726882870.89582: getting the remaining hosts for this loop 30564 1726882870.89584: done getting the remaining hosts for this loop 30564 1726882870.89588: getting the next task for host managed_node2 30564 1726882870.89598: done getting next task for host managed_node2 30564 1726882870.89600: ^ task is: TASK: Set current_interfaces 30564 1726882870.89607: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882870.89612: getting variables 30564 1726882870.89614: in VariableManager get_vars() 30564 1726882870.89802: Calling all_inventory to load vars for managed_node2 30564 1726882870.89805: Calling groups_inventory to load vars for managed_node2 30564 1726882870.89809: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.89822: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.89825: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.89827: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.92515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882870.94509: done with get_vars() 30564 1726882870.94565: done getting variables 30564 1726882870.94628: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:41:10 -0400 (0:00:00.420) 0:01:09.527 ****** 30564 1726882870.94656: entering _queue_task() for managed_node2/set_fact 30564 1726882870.94994: worker is 1 (out of 1 available) 30564 1726882870.95007: exiting _queue_task() for managed_node2/set_fact 30564 1726882870.95032: done queuing things up, now waiting for results queue to drain 30564 1726882870.95034: waiting for pending results... 
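The Set current_interfaces task that runs next turns the previous command's stdout into the `current_interfaces` fact (the log later shows it as ['bonding_masters', 'eth0', 'lo', 'rpltstbr']). A sketch of that derivation, assuming the set_fact simply splits the registered stdout into lines (the actual Jinja2 expression lives in get_current_interfaces.yml, which is not shown in this log):

```python
# Stdout recorded in the command result above.
stdout = "bonding_masters\neth0\nlo\nrpltstbr"

# Assumed equivalent of the set_fact: one interface name per line.
current_interfaces = stdout.splitlines()
print(current_interfaces)  # ['bonding_masters', 'eth0', 'lo', 'rpltstbr']
```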
30564 1726882870.95301: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30564 1726882870.95404: in run() - task 0e448fcc-3ce9-4216-acec-0000000017a9 30564 1726882870.95443: variable 'ansible_search_path' from source: unknown 30564 1726882870.95450: variable 'ansible_search_path' from source: unknown 30564 1726882870.95503: calling self._execute() 30564 1726882870.95622: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.95628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.95636: variable 'omit' from source: magic vars 30564 1726882870.95936: variable 'ansible_distribution_major_version' from source: facts 30564 1726882870.96240: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882870.96244: variable 'omit' from source: magic vars 30564 1726882870.96247: variable 'omit' from source: magic vars 30564 1726882870.96249: variable '_current_interfaces' from source: set_fact 30564 1726882870.96252: variable 'omit' from source: magic vars 30564 1726882870.96255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882870.96258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882870.96283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882870.96297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.96306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882870.96329: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882870.96332: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.96335: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.96407: Set connection var ansible_timeout to 10 30564 1726882870.96411: Set connection var ansible_pipelining to False 30564 1726882870.96414: Set connection var ansible_shell_type to sh 30564 1726882870.96419: Set connection var ansible_shell_executable to /bin/sh 30564 1726882870.96426: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882870.96429: Set connection var ansible_connection to ssh 30564 1726882870.96448: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.96456: variable 'ansible_connection' from source: unknown 30564 1726882870.96459: variable 'ansible_module_compression' from source: unknown 30564 1726882870.96461: variable 'ansible_shell_type' from source: unknown 30564 1726882870.96465: variable 'ansible_shell_executable' from source: unknown 30564 1726882870.96482: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882870.96485: variable 'ansible_pipelining' from source: unknown 30564 1726882870.96487: variable 'ansible_timeout' from source: unknown 30564 1726882870.96489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882870.96584: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882870.96592: variable 'omit' from source: magic vars 30564 1726882870.96597: starting attempt loop 30564 1726882870.96601: running the handler 30564 1726882870.96610: handler run complete 30564 1726882870.96619: attempt loop complete, returning result 30564 1726882870.96621: _execute() done 30564 1726882870.96623: dumping result to json 30564 1726882870.96626: done dumping result, returning 30564 
1726882870.96633: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-4216-acec-0000000017a9] 30564 1726882870.96638: sending task result for task 0e448fcc-3ce9-4216-acec-0000000017a9 30564 1726882870.96720: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000017a9 30564 1726882870.96723: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30564 1726882870.96781: no more pending results, returning what we have 30564 1726882870.96784: results queue empty 30564 1726882870.96785: checking for any_errors_fatal 30564 1726882870.96796: done checking for any_errors_fatal 30564 1726882870.96797: checking for max_fail_percentage 30564 1726882870.96799: done checking for max_fail_percentage 30564 1726882870.96800: checking to see if all hosts have failed and the running result is not ok 30564 1726882870.96800: done checking to see if all hosts have failed 30564 1726882870.96801: getting the remaining hosts for this loop 30564 1726882870.96803: done getting the remaining hosts for this loop 30564 1726882870.96807: getting the next task for host managed_node2 30564 1726882870.96816: done getting next task for host managed_node2 30564 1726882870.96818: ^ task is: TASK: Show current_interfaces 30564 1726882870.96822: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882870.96826: getting variables 30564 1726882870.96828: in VariableManager get_vars() 30564 1726882870.96942: Calling all_inventory to load vars for managed_node2 30564 1726882870.96945: Calling groups_inventory to load vars for managed_node2 30564 1726882870.96948: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882870.96957: Calling all_plugins_play to load vars for managed_node2 30564 1726882870.96960: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882870.96963: Calling groups_plugins_play to load vars for managed_node2 30564 1726882870.98788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.00151: done with get_vars() 30564 1726882871.00170: done getting variables 30564 1726882871.00210: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:41:11 -0400 (0:00:00.055) 0:01:09.583 ****** 30564 1726882871.00231: entering _queue_task() for managed_node2/debug 30564 1726882871.00501: worker is 1 (out of 1 available) 30564 1726882871.00514: exiting _queue_task() for managed_node2/debug 30564 1726882871.00526: done queuing things up, now waiting for results queue to drain 30564 1726882871.00528: waiting for 
pending results... 30564 1726882871.00752: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30564 1726882871.00844: in run() - task 0e448fcc-3ce9-4216-acec-00000000176e 30564 1726882871.00856: variable 'ansible_search_path' from source: unknown 30564 1726882871.00860: variable 'ansible_search_path' from source: unknown 30564 1726882871.00891: calling self._execute() 30564 1726882871.00983: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.00990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.01011: variable 'omit' from source: magic vars 30564 1726882871.01311: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.01322: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.01327: variable 'omit' from source: magic vars 30564 1726882871.01356: variable 'omit' from source: magic vars 30564 1726882871.01428: variable 'current_interfaces' from source: set_fact 30564 1726882871.01451: variable 'omit' from source: magic vars 30564 1726882871.01486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882871.01513: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882871.01550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882871.01589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882871.01620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882871.01653: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882871.01657: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.01665: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.01744: Set connection var ansible_timeout to 10 30564 1726882871.01747: Set connection var ansible_pipelining to False 30564 1726882871.01750: Set connection var ansible_shell_type to sh 30564 1726882871.01755: Set connection var ansible_shell_executable to /bin/sh 30564 1726882871.01762: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882871.01767: Set connection var ansible_connection to ssh 30564 1726882871.01787: variable 'ansible_shell_executable' from source: unknown 30564 1726882871.01790: variable 'ansible_connection' from source: unknown 30564 1726882871.01793: variable 'ansible_module_compression' from source: unknown 30564 1726882871.01795: variable 'ansible_shell_type' from source: unknown 30564 1726882871.01797: variable 'ansible_shell_executable' from source: unknown 30564 1726882871.01799: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.01801: variable 'ansible_pipelining' from source: unknown 30564 1726882871.01805: variable 'ansible_timeout' from source: unknown 30564 1726882871.01809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.02008: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882871.02020: variable 'omit' from source: magic vars 30564 1726882871.02032: starting attempt loop 30564 1726882871.02036: running the handler 30564 1726882871.02102: handler run complete 30564 1726882871.02112: attempt loop complete, returning result 30564 1726882871.02115: _execute() done 30564 1726882871.02118: dumping result to json 30564 1726882871.02121: done dumping result, returning 30564 
1726882871.02128: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-4216-acec-00000000176e] 30564 1726882871.02133: sending task result for task 0e448fcc-3ce9-4216-acec-00000000176e 30564 1726882871.02260: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000176e 30564 1726882871.02262: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30564 1726882871.02326: no more pending results, returning what we have 30564 1726882871.02328: results queue empty 30564 1726882871.02329: checking for any_errors_fatal 30564 1726882871.02358: done checking for any_errors_fatal 30564 1726882871.02359: checking for max_fail_percentage 30564 1726882871.02382: done checking for max_fail_percentage 30564 1726882871.02383: checking to see if all hosts have failed and the running result is not ok 30564 1726882871.02384: done checking to see if all hosts have failed 30564 1726882871.02404: getting the remaining hosts for this loop 30564 1726882871.02405: done getting the remaining hosts for this loop 30564 1726882871.02409: getting the next task for host managed_node2 30564 1726882871.02416: done getting next task for host managed_node2 30564 1726882871.02419: ^ task is: TASK: Setup 30564 1726882871.02422: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882871.02426: getting variables 30564 1726882871.02427: in VariableManager get_vars() 30564 1726882871.02479: Calling all_inventory to load vars for managed_node2 30564 1726882871.02481: Calling groups_inventory to load vars for managed_node2 30564 1726882871.02484: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.02491: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.02493: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.02495: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.03328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.04928: done with get_vars() 30564 1726882871.04962: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:41:11 -0400 (0:00:00.048) 0:01:09.632 ****** 30564 1726882871.05072: entering _queue_task() for managed_node2/include_tasks 30564 1726882871.05346: worker is 1 (out of 1 available) 30564 1726882871.05371: exiting _queue_task() for managed_node2/include_tasks 30564 1726882871.05386: done queuing things up, now waiting for results queue to drain 30564 1726882871.05388: waiting for pending results... 
30564 1726882871.05749: running TaskExecutor() for managed_node2/TASK: Setup 30564 1726882871.05847: in run() - task 0e448fcc-3ce9-4216-acec-000000001747 30564 1726882871.05858: variable 'ansible_search_path' from source: unknown 30564 1726882871.05862: variable 'ansible_search_path' from source: unknown 30564 1726882871.05906: variable 'lsr_setup' from source: include params 30564 1726882871.06385: variable 'lsr_setup' from source: include params 30564 1726882871.06494: variable 'omit' from source: magic vars 30564 1726882871.06666: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.06684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.06696: variable 'omit' from source: magic vars 30564 1726882871.07033: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.07037: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.07060: variable 'item' from source: unknown 30564 1726882871.07138: variable 'item' from source: unknown 30564 1726882871.07184: variable 'item' from source: unknown 30564 1726882871.07242: variable 'item' from source: unknown 30564 1726882871.07461: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.07479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.07483: variable 'omit' from source: magic vars 30564 1726882871.07615: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.07628: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.07641: variable 'item' from source: unknown 30564 1726882871.07734: variable 'item' from source: unknown 30564 1726882871.07777: variable 'item' from source: unknown 30564 1726882871.07862: variable 'item' from source: unknown 30564 1726882871.08020: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882871.08037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.08070: variable 'omit' from source: magic vars 30564 1726882871.08242: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.08254: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.08266: variable 'item' from source: unknown 30564 1726882871.08359: variable 'item' from source: unknown 30564 1726882871.08413: variable 'item' from source: unknown 30564 1726882871.08486: variable 'item' from source: unknown 30564 1726882871.08576: dumping result to json 30564 1726882871.08595: done dumping result, returning 30564 1726882871.08616: done running TaskExecutor() for managed_node2/TASK: Setup [0e448fcc-3ce9-4216-acec-000000001747] 30564 1726882871.08634: sending task result for task 0e448fcc-3ce9-4216-acec-000000001747 30564 1726882871.08747: no more pending results, returning what we have 30564 1726882871.08752: in VariableManager get_vars() 30564 1726882871.08807: Calling all_inventory to load vars for managed_node2 30564 1726882871.08811: Calling groups_inventory to load vars for managed_node2 30564 1726882871.08815: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.08838: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.08842: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.08845: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.09884: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001747 30564 1726882871.09888: WORKER PROCESS EXITING 30564 1726882871.10911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.13255: done with get_vars() 30564 1726882871.13284: variable 'ansible_search_path' from source: unknown 30564 1726882871.13288: variable 'ansible_search_path' from source: unknown 30564 
1726882871.13353: variable 'ansible_search_path' from source: unknown 30564 1726882871.13354: variable 'ansible_search_path' from source: unknown 30564 1726882871.13398: variable 'ansible_search_path' from source: unknown 30564 1726882871.13399: variable 'ansible_search_path' from source: unknown 30564 1726882871.13442: we have included files to process 30564 1726882871.13446: generating all_blocks data 30564 1726882871.13447: done generating all_blocks data 30564 1726882871.13455: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882871.13456: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882871.13462: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882871.13793: done processing included file 30564 1726882871.13795: iterating over new_blocks loaded from include file 30564 1726882871.13798: in VariableManager get_vars() 30564 1726882871.13815: done with get_vars() 30564 1726882871.13817: filtering new block on tags 30564 1726882871.13860: done filtering new block on tags 30564 1726882871.13871: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30564 1726882871.13880: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882871.13881: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882871.13884: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882871.13983: done processing included file 30564 1726882871.13986: iterating over new_blocks loaded from include file 30564 1726882871.13987: in VariableManager get_vars() 30564 1726882871.14008: done with get_vars() 30564 1726882871.14010: filtering new block on tags 30564 1726882871.14041: done filtering new block on tags 30564 1726882871.14043: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30564 1726882871.14046: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30564 1726882871.14047: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30564 1726882871.14050: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml 30564 1726882871.14185: done processing included file 30564 1726882871.14187: iterating over new_blocks loaded from include file 30564 1726882871.14189: in VariableManager get_vars() 30564 1726882871.14204: done with get_vars() 30564 1726882871.14205: filtering new block on tags 30564 1726882871.14228: done filtering new block on tags 30564 1726882871.14230: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml for managed_node2 => (item=tasks/remove_profile.yml) 30564 1726882871.14233: extending task lists for all hosts with included blocks 30564 1726882871.15155: done extending task lists 30564 1726882871.15161: done processing included files 30564 
1726882871.15162: results queue empty 30564 1726882871.15166: checking for any_errors_fatal 30564 1726882871.15173: done checking for any_errors_fatal 30564 1726882871.15173: checking for max_fail_percentage 30564 1726882871.15175: done checking for max_fail_percentage 30564 1726882871.15175: checking to see if all hosts have failed and the running result is not ok 30564 1726882871.15176: done checking to see if all hosts have failed 30564 1726882871.15177: getting the remaining hosts for this loop 30564 1726882871.15178: done getting the remaining hosts for this loop 30564 1726882871.15181: getting the next task for host managed_node2 30564 1726882871.15185: done getting next task for host managed_node2 30564 1726882871.15187: ^ task is: TASK: Include network role 30564 1726882871.15190: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882871.15193: getting variables 30564 1726882871.15194: in VariableManager get_vars() 30564 1726882871.15203: Calling all_inventory to load vars for managed_node2 30564 1726882871.15205: Calling groups_inventory to load vars for managed_node2 30564 1726882871.15208: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.15213: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.15218: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.15221: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.16757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.18712: done with get_vars() 30564 1726882871.18743: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:41:11 -0400 (0:00:00.137) 0:01:09.769 ****** 30564 1726882871.18867: entering _queue_task() for managed_node2/include_role 30564 1726882871.20761: worker is 1 (out of 1 available) 30564 1726882871.20775: exiting _queue_task() for managed_node2/include_role 30564 1726882871.20823: done queuing things up, now waiting for results queue to drain 30564 1726882871.20825: waiting for pending results... 
30564 1726882871.21441: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882871.21609: in run() - task 0e448fcc-3ce9-4216-acec-0000000017d0 30564 1726882871.21650: variable 'ansible_search_path' from source: unknown 30564 1726882871.21662: variable 'ansible_search_path' from source: unknown 30564 1726882871.21723: calling self._execute() 30564 1726882871.21866: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.21889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.21907: variable 'omit' from source: magic vars 30564 1726882871.22392: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.22426: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.22445: _execute() done 30564 1726882871.22454: dumping result to json 30564 1726882871.22474: done dumping result, returning 30564 1726882871.22488: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-0000000017d0] 30564 1726882871.22508: sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d0 30564 1726882871.22705: no more pending results, returning what we have 30564 1726882871.22713: in VariableManager get_vars() 30564 1726882871.22777: Calling all_inventory to load vars for managed_node2 30564 1726882871.22785: Calling groups_inventory to load vars for managed_node2 30564 1726882871.22790: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.22813: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.22818: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.22823: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.23892: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d0 30564 1726882871.23895: WORKER PROCESS EXITING 30564 1726882871.24976: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.27029: done with get_vars() 30564 1726882871.27050: variable 'ansible_search_path' from source: unknown 30564 1726882871.27051: variable 'ansible_search_path' from source: unknown 30564 1726882871.27253: variable 'omit' from source: magic vars 30564 1726882871.27296: variable 'omit' from source: magic vars 30564 1726882871.27312: variable 'omit' from source: magic vars 30564 1726882871.27316: we have included files to process 30564 1726882871.27317: generating all_blocks data 30564 1726882871.27319: done generating all_blocks data 30564 1726882871.27320: processing included file: fedora.linux_system_roles.network 30564 1726882871.27341: in VariableManager get_vars() 30564 1726882871.27356: done with get_vars() 30564 1726882871.27391: in VariableManager get_vars() 30564 1726882871.27409: done with get_vars() 30564 1726882871.27447: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882871.27581: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882871.27666: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882871.28139: in VariableManager get_vars() 30564 1726882871.28160: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882871.30188: iterating over new_blocks loaded from include file 30564 1726882871.30191: in VariableManager get_vars() 30564 1726882871.30208: done with get_vars() 30564 1726882871.30210: filtering new block on tags 30564 1726882871.30513: done filtering new block on tags 30564 1726882871.30516: in VariableManager get_vars() 30564 1726882871.30531: done with get_vars() 30564 1726882871.30533: filtering new block on tags 30564 1726882871.30550: done 
filtering new block on tags 30564 1726882871.30551: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882871.30557: extending task lists for all hosts with included blocks 30564 1726882871.30730: done extending task lists 30564 1726882871.30731: done processing included files 30564 1726882871.30732: results queue empty 30564 1726882871.30733: checking for any_errors_fatal 30564 1726882871.30736: done checking for any_errors_fatal 30564 1726882871.30737: checking for max_fail_percentage 30564 1726882871.30738: done checking for max_fail_percentage 30564 1726882871.30739: checking to see if all hosts have failed and the running result is not ok 30564 1726882871.30740: done checking to see if all hosts have failed 30564 1726882871.30741: getting the remaining hosts for this loop 30564 1726882871.30742: done getting the remaining hosts for this loop 30564 1726882871.30745: getting the next task for host managed_node2 30564 1726882871.30749: done getting next task for host managed_node2 30564 1726882871.30752: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882871.30756: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882871.30771: getting variables 30564 1726882871.30772: in VariableManager get_vars() 30564 1726882871.30786: Calling all_inventory to load vars for managed_node2 30564 1726882871.30788: Calling groups_inventory to load vars for managed_node2 30564 1726882871.30790: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.30796: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.30798: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.30801: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.32084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.33824: done with get_vars() 30564 1726882871.33845: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:11 -0400 (0:00:00.150) 0:01:09.920 ****** 30564 1726882871.33924: entering _queue_task() for managed_node2/include_tasks 30564 1726882871.34296: worker is 1 (out of 1 available) 30564 1726882871.34309: exiting _queue_task() for managed_node2/include_tasks 30564 1726882871.34321: done queuing things up, now waiting for results queue to drain 30564 1726882871.34323: waiting for pending results... 
30564 1726882871.34626: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882871.34787: in run() - task 0e448fcc-3ce9-4216-acec-00000000183a 30564 1726882871.34806: variable 'ansible_search_path' from source: unknown 30564 1726882871.34815: variable 'ansible_search_path' from source: unknown 30564 1726882871.34854: calling self._execute() 30564 1726882871.34955: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.34973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.34993: variable 'omit' from source: magic vars 30564 1726882871.35379: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.35397: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.35410: _execute() done 30564 1726882871.35420: dumping result to json 30564 1726882871.35428: done dumping result, returning 30564 1726882871.35439: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-00000000183a] 30564 1726882871.35450: sending task result for task 0e448fcc-3ce9-4216-acec-00000000183a 30564 1726882871.35597: no more pending results, returning what we have 30564 1726882871.35602: in VariableManager get_vars() 30564 1726882871.35649: Calling all_inventory to load vars for managed_node2 30564 1726882871.35652: Calling groups_inventory to load vars for managed_node2 30564 1726882871.35655: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.35673: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.35678: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.35682: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.36686: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000183a 30564 
1726882871.36689: WORKER PROCESS EXITING 30564 1726882871.42434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.44595: done with get_vars() 30564 1726882871.44619: variable 'ansible_search_path' from source: unknown 30564 1726882871.44621: variable 'ansible_search_path' from source: unknown 30564 1726882871.44660: we have included files to process 30564 1726882871.44662: generating all_blocks data 30564 1726882871.44664: done generating all_blocks data 30564 1726882871.44670: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882871.44671: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882871.44673: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882871.45240: done processing included file 30564 1726882871.45242: iterating over new_blocks loaded from include file 30564 1726882871.45244: in VariableManager get_vars() 30564 1726882871.45274: done with get_vars() 30564 1726882871.45276: filtering new block on tags 30564 1726882871.45306: done filtering new block on tags 30564 1726882871.45309: in VariableManager get_vars() 30564 1726882871.45331: done with get_vars() 30564 1726882871.45333: filtering new block on tags 30564 1726882871.45387: done filtering new block on tags 30564 1726882871.45390: in VariableManager get_vars() 30564 1726882871.45412: done with get_vars() 30564 1726882871.45414: filtering new block on tags 30564 1726882871.45456: done filtering new block on tags 30564 1726882871.45458: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882871.45465: extending task lists for all hosts 
with included blocks 30564 1726882871.47266: done extending task lists 30564 1726882871.47269: done processing included files 30564 1726882871.47270: results queue empty 30564 1726882871.47271: checking for any_errors_fatal 30564 1726882871.47275: done checking for any_errors_fatal 30564 1726882871.47275: checking for max_fail_percentage 30564 1726882871.47277: done checking for max_fail_percentage 30564 1726882871.47277: checking to see if all hosts have failed and the running result is not ok 30564 1726882871.47278: done checking to see if all hosts have failed 30564 1726882871.47279: getting the remaining hosts for this loop 30564 1726882871.47281: done getting the remaining hosts for this loop 30564 1726882871.47283: getting the next task for host managed_node2 30564 1726882871.47288: done getting next task for host managed_node2 30564 1726882871.47290: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882871.47293: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882871.47303: getting variables 30564 1726882871.47304: in VariableManager get_vars() 30564 1726882871.47317: Calling all_inventory to load vars for managed_node2 30564 1726882871.47319: Calling groups_inventory to load vars for managed_node2 30564 1726882871.47321: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.47326: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.47329: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.47331: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.48553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.50325: done with get_vars() 30564 1726882871.50345: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:11 -0400 (0:00:00.164) 0:01:10.085 ****** 30564 1726882871.50421: entering _queue_task() for managed_node2/setup 30564 1726882871.50770: worker is 1 (out of 1 available) 30564 1726882871.50782: exiting _queue_task() for managed_node2/setup 30564 1726882871.50795: done queuing things up, now waiting for results queue to drain 30564 1726882871.50796: waiting for pending results... 
30564 1726882871.51102: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882871.51270: in run() - task 0e448fcc-3ce9-4216-acec-000000001897 30564 1726882871.51292: variable 'ansible_search_path' from source: unknown 30564 1726882871.51300: variable 'ansible_search_path' from source: unknown 30564 1726882871.51343: calling self._execute() 30564 1726882871.51457: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.51480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.51497: variable 'omit' from source: magic vars 30564 1726882871.51899: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.51918: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.52152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882871.54548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882871.54635: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882871.54678: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882871.54721: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882871.54751: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882871.54841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882871.54881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882871.54913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882871.54960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882871.54983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882871.55031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882871.55059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882871.55091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882871.55131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882871.55151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882871.55302: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882871.55314: variable 'ansible_facts' from source: unknown 30564 1726882871.56090: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882871.56100: when evaluation is False, skipping this task 30564 1726882871.56106: _execute() done 30564 1726882871.56113: dumping result to json 30564 1726882871.56124: done dumping result, returning 30564 1726882871.56133: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000001897] 30564 1726882871.56142: sending task result for task 0e448fcc-3ce9-4216-acec-000000001897 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882871.56280: no more pending results, returning what we have 30564 1726882871.56286: results queue empty 30564 1726882871.56287: checking for any_errors_fatal 30564 1726882871.56289: done checking for any_errors_fatal 30564 1726882871.56289: checking for max_fail_percentage 30564 1726882871.56292: done checking for max_fail_percentage 30564 1726882871.56293: checking to see if all hosts have failed and the running result is not ok 30564 1726882871.56294: done checking to see if all hosts have failed 30564 1726882871.56295: getting the remaining hosts for this loop 30564 1726882871.56297: done getting the remaining hosts for this loop 30564 1726882871.56301: getting the next task for host managed_node2 30564 1726882871.56315: done getting next task for host managed_node2 30564 1726882871.56318: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882871.56326: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882871.56346: getting variables 30564 1726882871.56348: in VariableManager get_vars() 30564 1726882871.56393: Calling all_inventory to load vars for managed_node2 30564 1726882871.56395: Calling groups_inventory to load vars for managed_node2 30564 1726882871.56398: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.56408: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.56410: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.56413: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.57620: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001897 30564 1726882871.57631: WORKER PROCESS EXITING 30564 1726882871.58122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.59881: done with get_vars() 30564 1726882871.59903: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:11 -0400 (0:00:00.095) 0:01:10.181 ****** 30564 1726882871.60001: entering _queue_task() for managed_node2/stat 30564 1726882871.60291: worker is 1 (out of 1 available) 30564 1726882871.60306: exiting _queue_task() for managed_node2/stat 30564 1726882871.60319: done queuing things up, now waiting for results queue to drain 30564 1726882871.60320: waiting for pending results... 
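The "Ensure ansible_facts used by role are present" task above was skipped because its `when` condition, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluated to False. A minimal sketch of that gate, with hypothetical fact names standing in for the role's real defaults:

```python
# Sketch of the gate evaluated in the log above:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Setup (fact gathering) runs only when some required fact key is still
# missing. Both lists below are hypothetical stand-ins, not the role's
# actual defaults.
required_facts = ["distribution", "os_family"]
ansible_facts = {"distribution": "Fedora", "os_family": "RedHat"}

# Jinja's `difference` filter keeps items of the left list that are
# absent from the right list.
missing = [f for f in required_facts if f not in ansible_facts]
needs_setup = len(missing) > 0
print(needs_setup)  # False -> task skipped, as in the log
```

With every required key already present in `ansible_facts`, the difference is empty and the task is skipped, which is exactly the `skipping: [managed_node2]` result shown above.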
30564 1726882871.60619: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882871.60800: in run() - task 0e448fcc-3ce9-4216-acec-000000001899 30564 1726882871.60821: variable 'ansible_search_path' from source: unknown 30564 1726882871.60831: variable 'ansible_search_path' from source: unknown 30564 1726882871.60876: calling self._execute() 30564 1726882871.60980: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.60995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.61009: variable 'omit' from source: magic vars 30564 1726882871.61394: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.61413: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.61593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882871.61881: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882871.61930: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882871.61976: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882871.62017: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882871.62494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882871.62528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882871.62559: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882871.62595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882871.62698: variable '__network_is_ostree' from source: set_fact 30564 1726882871.62709: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882871.62718: when evaluation is False, skipping this task 30564 1726882871.62726: _execute() done 30564 1726882871.62733: dumping result to json 30564 1726882871.62739: done dumping result, returning 30564 1726882871.62750: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000001899] 30564 1726882871.62759: sending task result for task 0e448fcc-3ce9-4216-acec-000000001899 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882871.62917: no more pending results, returning what we have 30564 1726882871.62922: results queue empty 30564 1726882871.62923: checking for any_errors_fatal 30564 1726882871.62931: done checking for any_errors_fatal 30564 1726882871.62932: checking for max_fail_percentage 30564 1726882871.62934: done checking for max_fail_percentage 30564 1726882871.62935: checking to see if all hosts have failed and the running result is not ok 30564 1726882871.62935: done checking to see if all hosts have failed 30564 1726882871.62936: getting the remaining hosts for this loop 30564 1726882871.62938: done getting the remaining hosts for this loop 30564 1726882871.62942: getting the next task for host managed_node2 30564 1726882871.62952: done getting next task for host managed_node2 30564 
1726882871.62955: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882871.62962: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882871.62987: getting variables 30564 1726882871.62989: in VariableManager get_vars() 30564 1726882871.63029: Calling all_inventory to load vars for managed_node2 30564 1726882871.63032: Calling groups_inventory to load vars for managed_node2 30564 1726882871.63035: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.63045: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.63048: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.63051: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.64085: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001899 30564 1726882871.64088: WORKER PROCESS EXITING 30564 1726882871.64858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.66658: done with get_vars() 30564 1726882871.66685: done getting variables 30564 1726882871.66741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:11 -0400 (0:00:00.067) 0:01:10.249 ****** 30564 1726882871.66781: entering _queue_task() for managed_node2/set_fact 30564 1726882871.67085: worker is 1 (out of 1 available) 30564 1726882871.67098: exiting _queue_task() for managed_node2/set_fact 30564 1726882871.67110: done queuing things up, now waiting for results queue to drain 30564 1726882871.67111: waiting for pending results... 
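Both the "Check if system is ostree" stat task and the following set_fact task are skipped on `not __network_is_ostree is defined`: once an earlier run of the role has set the fact, the check never repeats. A small sketch of that check-once pattern, assuming `is defined` behaves as a key-existence test on the host's variables:

```python
# Sketch of the "check once, then skip" pattern in the two skipped tasks
# above: once __network_is_ostree exists (from an earlier set_fact),
# `not __network_is_ostree is defined` is False and both the stat task
# and the set_fact task are skipped.
host_vars = {"__network_is_ostree": False}  # assumed set by a prior run


def should_run(var_name: str, facts: dict) -> bool:
    # Approximation: Jinja's `is defined` test acts as a key-existence
    # check against the host's resolved variables.
    return var_name not in facts


print(should_run("__network_is_ostree", host_vars))  # False -> skipped
```

On a fresh host without the fact, `should_run` would return True and the stat against the ostree marker path would actually execute.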
30564 1726882871.67412: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882871.67585: in run() - task 0e448fcc-3ce9-4216-acec-00000000189a 30564 1726882871.67605: variable 'ansible_search_path' from source: unknown 30564 1726882871.67613: variable 'ansible_search_path' from source: unknown 30564 1726882871.67651: calling self._execute() 30564 1726882871.67760: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.67780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.67797: variable 'omit' from source: magic vars 30564 1726882871.68196: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.68219: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.68395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882871.68692: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882871.68741: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882871.68784: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882871.68821: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882871.68940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882871.68979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882871.69011: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882871.69041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882871.69145: variable '__network_is_ostree' from source: set_fact 30564 1726882871.69157: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882871.69166: when evaluation is False, skipping this task 30564 1726882871.69180: _execute() done 30564 1726882871.69188: dumping result to json 30564 1726882871.69194: done dumping result, returning 30564 1726882871.69204: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-00000000189a] 30564 1726882871.69214: sending task result for task 0e448fcc-3ce9-4216-acec-00000000189a skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882871.69353: no more pending results, returning what we have 30564 1726882871.69357: results queue empty 30564 1726882871.69358: checking for any_errors_fatal 30564 1726882871.69372: done checking for any_errors_fatal 30564 1726882871.69373: checking for max_fail_percentage 30564 1726882871.69375: done checking for max_fail_percentage 30564 1726882871.69376: checking to see if all hosts have failed and the running result is not ok 30564 1726882871.69377: done checking to see if all hosts have failed 30564 1726882871.69378: getting the remaining hosts for this loop 30564 1726882871.69380: done getting the remaining hosts for this loop 30564 1726882871.69384: getting the next task for host managed_node2 30564 1726882871.69396: done getting next task for host managed_node2 30564 
1726882871.69401: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882871.69409: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882871.69431: getting variables 30564 1726882871.69433: in VariableManager get_vars() 30564 1726882871.69476: Calling all_inventory to load vars for managed_node2 30564 1726882871.69480: Calling groups_inventory to load vars for managed_node2 30564 1726882871.69483: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882871.69494: Calling all_plugins_play to load vars for managed_node2 30564 1726882871.69497: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882871.69500: Calling groups_plugins_play to load vars for managed_node2 30564 1726882871.70884: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000189a 30564 1726882871.70888: WORKER PROCESS EXITING 30564 1726882871.71390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882871.73201: done with get_vars() 30564 1726882871.73224: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:11 -0400 (0:00:00.065) 0:01:10.314 ****** 30564 1726882871.73323: entering _queue_task() for managed_node2/service_facts 30564 1726882871.73611: worker is 1 (out of 1 available) 30564 1726882871.73624: exiting _queue_task() for managed_node2/service_facts 30564 1726882871.73637: done queuing things up, now waiting for results queue to drain 30564 1726882871.73638: waiting for pending results... 
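Each task above first passes the gate `ansible_distribution_major_version != '6'` before its own `when` is considered. Worth noting: that fact is a string, so this is a string comparison. A minimal sketch with a hypothetical fact value:

```python
# Sketch of the version gate logged before each task above:
#   Evaluated conditional (ansible_distribution_major_version != '6'): True
# The fact is a *string*, so '!=' compares text, not numbers; the value
# below is hypothetical.
facts = {"ansible_distribution_major_version": "9"}

gate = facts["ansible_distribution_major_version"] != "6"
print(gate)  # True -> execution proceeds to the task's own condition
```

Inequality against a single literal is safe here; ordering comparisons (`>`, `<`) on such string facts would need an explicit `| int` cast to behave numerically.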
30564 1726882871.73944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882871.74108: in run() - task 0e448fcc-3ce9-4216-acec-00000000189c 30564 1726882871.74127: variable 'ansible_search_path' from source: unknown 30564 1726882871.74134: variable 'ansible_search_path' from source: unknown 30564 1726882871.74177: calling self._execute() 30564 1726882871.74291: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.74305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.74319: variable 'omit' from source: magic vars 30564 1726882871.74706: variable 'ansible_distribution_major_version' from source: facts 30564 1726882871.74724: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882871.74738: variable 'omit' from source: magic vars 30564 1726882871.74819: variable 'omit' from source: magic vars 30564 1726882871.74858: variable 'omit' from source: magic vars 30564 1726882871.74905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882871.74946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882871.74976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882871.74998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882871.75013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882871.75047: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882871.75057: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.75070: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882871.75176: Set connection var ansible_timeout to 10 30564 1726882871.75186: Set connection var ansible_pipelining to False 30564 1726882871.75193: Set connection var ansible_shell_type to sh 30564 1726882871.75201: Set connection var ansible_shell_executable to /bin/sh 30564 1726882871.75212: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882871.75218: Set connection var ansible_connection to ssh 30564 1726882871.75245: variable 'ansible_shell_executable' from source: unknown 30564 1726882871.75252: variable 'ansible_connection' from source: unknown 30564 1726882871.75260: variable 'ansible_module_compression' from source: unknown 30564 1726882871.75271: variable 'ansible_shell_type' from source: unknown 30564 1726882871.75282: variable 'ansible_shell_executable' from source: unknown 30564 1726882871.75289: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882871.75296: variable 'ansible_pipelining' from source: unknown 30564 1726882871.75302: variable 'ansible_timeout' from source: unknown 30564 1726882871.75309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882871.75508: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882871.75524: variable 'omit' from source: magic vars 30564 1726882871.75532: starting attempt loop 30564 1726882871.75538: running the handler 30564 1726882871.75554: _low_level_execute_command(): starting 30564 1726882871.75570: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882871.76342: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882871.76356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882871.76378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.76397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.76441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.76454: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882871.76476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.76495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882871.76507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882871.76518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882871.76529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.76542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.76556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.76572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.76587: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882871.76600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.76678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882871.76697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882871.76711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882871.76853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882871.78519: stdout chunk (state=3): >>>/root <<< 30564 1726882871.78623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882871.78684: stderr chunk (state=3): >>><<< 30564 1726882871.78691: stdout chunk (state=3): >>><<< 30564 1726882871.78720: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882871.78730: _low_level_execute_command(): starting 30564 1726882871.78736: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720 `" && echo ansible-tmp-1726882871.7871644-33589-30021977668720="` echo /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720 `" ) && sleep 0' 30564 1726882871.79341: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882871.79350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.79362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.79380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.79418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.79426: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882871.79439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.79473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882871.79481: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882871.79488: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882871.79496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.79505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.79515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.79523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.79529: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882871.79539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.79608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882871.79623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882871.79627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882871.79760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882871.81626: stdout chunk (state=3): >>>ansible-tmp-1726882871.7871644-33589-30021977668720=/root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720 <<< 30564 1726882871.81814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882871.81818: stdout chunk (state=3): >>><<< 30564 1726882871.81820: stderr chunk (state=3): >>><<< 30564 1726882871.82074: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882871.7871644-33589-30021977668720=/root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882871.82077: variable 'ansible_module_compression' from source: unknown 30564 1726882871.82080: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882871.82082: variable 'ansible_facts' from source: unknown 30564 1726882871.82084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720/AnsiballZ_service_facts.py 30564 1726882871.82226: Sending initial data 30564 1726882871.82229: Sent initial data (161 bytes) 30564 1726882871.83241: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882871.83256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.83282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.83300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.83341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.83353: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882871.83372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.83397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882871.83409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882871.83421: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882871.83434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.83448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.83473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.83491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882871.83508: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882871.83523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.83603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882871.83624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882871.83639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882871.83779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882871.85527: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882871.85621: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882871.85725: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp9hr1951v /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720/AnsiballZ_service_facts.py <<< 30564 1726882871.86214: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882871.87531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882871.87535: stderr chunk (state=3): >>><<< 30564 1726882871.87542: stdout chunk (state=3): >>><<< 30564 
1726882871.87558: done transferring module to remote 30564 1726882871.87573: _low_level_execute_command(): starting 30564 1726882871.87576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720/ /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720/AnsiballZ_service_facts.py && sleep 0' 30564 1726882871.88283: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882871.88297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.88317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.88335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.88381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.88394: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882871.88407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.88430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882871.88443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882871.88454: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882871.88471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.88486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.88503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.88516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.88534: stderr 
chunk (state=3): >>>debug2: match found <<< 30564 1726882871.88548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.88630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882871.88656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882871.88679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882871.88806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882871.90546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882871.90624: stderr chunk (state=3): >>><<< 30564 1726882871.90635: stdout chunk (state=3): >>><<< 30564 1726882871.90732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 30564 1726882871.90736: _low_level_execute_command(): starting 30564 1726882871.90739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720/AnsiballZ_service_facts.py && sleep 0' 30564 1726882871.91320: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882871.91334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.91349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.91371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.91419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.91432: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882871.91447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.91466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882871.91483: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882871.91495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882871.91514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882871.91529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882871.91546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882871.91558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882871.91575: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882871.91589: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882871.91675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882871.91696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882871.91711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882871.91849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882873.24613: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 30564 1726882873.24681: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 30564 1726882873.24687: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 30564 1726882873.24700: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 30564 1726882873.24704: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882873.25914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882873.25985: stderr chunk (state=3): >>><<< 30564 1726882873.25989: stdout chunk (state=3): >>><<< 30564 1726882873.26019: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": 
"systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": 
{"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": 
"ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": 
"systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.11.158 closed. 30564 1726882873.26542: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882873.26548: _low_level_execute_command(): starting 30564 1726882873.26553: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882871.7871644-33589-30021977668720/ > /dev/null 2>&1 && sleep 0' 30564 1726882873.27166: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.27179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.27199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.27219: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882873.27223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.27246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882873.27251: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882873.27272: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 30564 1726882873.27284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.27298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.27324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.27328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.27344: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882873.27347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.27420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882873.27440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882873.27580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882873.29388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882873.29490: stderr chunk (state=3): >>><<< 30564 1726882873.29493: stdout chunk (state=3): >>><<< 30564 1726882873.29674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882873.29677: handler run complete 30564 1726882873.29704: variable 'ansible_facts' from source: unknown 30564 1726882873.29818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882873.30273: variable 'ansible_facts' from source: unknown 30564 1726882873.30396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882873.30621: attempt loop complete, returning result 30564 1726882873.30624: _execute() done 30564 1726882873.30626: dumping result to json 30564 1726882873.30682: done dumping result, returning 30564 1726882873.30702: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-00000000189c] 30564 1726882873.30705: sending task result for task 0e448fcc-3ce9-4216-acec-00000000189c ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882873.31712: no more pending results, returning what we have 30564 1726882873.31715: results queue empty 30564 1726882873.31716: checking for any_errors_fatal 30564 1726882873.31722: done checking for any_errors_fatal 30564 1726882873.31723: checking for max_fail_percentage 30564 1726882873.31725: done checking for max_fail_percentage 30564 1726882873.31726: checking to see if all hosts have failed and the 
running result is not ok 30564 1726882873.31726: done checking to see if all hosts have failed 30564 1726882873.31727: getting the remaining hosts for this loop 30564 1726882873.31728: done getting the remaining hosts for this loop 30564 1726882873.31732: getting the next task for host managed_node2 30564 1726882873.31739: done getting next task for host managed_node2 30564 1726882873.31743: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882873.31759: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882873.31774: getting variables 30564 1726882873.31776: in VariableManager get_vars() 30564 1726882873.31810: Calling all_inventory to load vars for managed_node2 30564 1726882873.31813: Calling groups_inventory to load vars for managed_node2 30564 1726882873.31819: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882873.31828: Calling all_plugins_play to load vars for managed_node2 30564 1726882873.31832: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882873.31835: Calling groups_plugins_play to load vars for managed_node2 30564 1726882873.32461: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000189c 30564 1726882873.32464: WORKER PROCESS EXITING 30564 1726882873.33315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882873.35600: done with get_vars() 30564 1726882873.35622: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:13 -0400 (0:00:01.623) 0:01:11.938 ****** 30564 1726882873.35720: entering _queue_task() for managed_node2/package_facts 30564 1726882873.36024: worker is 1 (out of 1 available) 30564 1726882873.36038: exiting _queue_task() for managed_node2/package_facts 30564 1726882873.36052: done queuing things up, now waiting for results queue to drain 30564 1726882873.36053: waiting for pending results... 
30564 1726882873.36345: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882873.36533: in run() - task 0e448fcc-3ce9-4216-acec-00000000189d 30564 1726882873.36554: variable 'ansible_search_path' from source: unknown 30564 1726882873.36561: variable 'ansible_search_path' from source: unknown 30564 1726882873.36610: calling self._execute() 30564 1726882873.36722: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882873.36734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882873.36748: variable 'omit' from source: magic vars 30564 1726882873.37091: variable 'ansible_distribution_major_version' from source: facts 30564 1726882873.37104: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882873.37116: variable 'omit' from source: magic vars 30564 1726882873.37170: variable 'omit' from source: magic vars 30564 1726882873.37201: variable 'omit' from source: magic vars 30564 1726882873.37258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882873.37303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882873.37326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882873.37360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882873.37387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882873.37428: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882873.37431: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882873.37434: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882873.37571: Set connection var ansible_timeout to 10 30564 1726882873.37585: Set connection var ansible_pipelining to False 30564 1726882873.37601: Set connection var ansible_shell_type to sh 30564 1726882873.37623: Set connection var ansible_shell_executable to /bin/sh 30564 1726882873.37640: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882873.37651: Set connection var ansible_connection to ssh 30564 1726882873.37696: variable 'ansible_shell_executable' from source: unknown 30564 1726882873.37715: variable 'ansible_connection' from source: unknown 30564 1726882873.37731: variable 'ansible_module_compression' from source: unknown 30564 1726882873.37746: variable 'ansible_shell_type' from source: unknown 30564 1726882873.37758: variable 'ansible_shell_executable' from source: unknown 30564 1726882873.37771: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882873.37782: variable 'ansible_pipelining' from source: unknown 30564 1726882873.37784: variable 'ansible_timeout' from source: unknown 30564 1726882873.37796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882873.38039: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882873.38073: variable 'omit' from source: magic vars 30564 1726882873.38090: starting attempt loop 30564 1726882873.38098: running the handler 30564 1726882873.38123: _low_level_execute_command(): starting 30564 1726882873.38144: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882873.38751: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882873.38771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882873.38791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.38820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.38882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.38898: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882873.38921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.38947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882873.38961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882873.38978: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882873.38991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.39004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.39018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.39034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.39052: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882873.39080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.39178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882873.39201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882873.39225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882873.39394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882873.41017: stdout chunk (state=3): >>>/root <<< 30564 1726882873.41216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882873.41219: stdout chunk (state=3): >>><<< 30564 1726882873.41222: stderr chunk (state=3): >>><<< 30564 1726882873.41341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882873.41344: _low_level_execute_command(): starting 30564 1726882873.41348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285 `" && echo ansible-tmp-1726882873.4124088-33650-267633691540285="` echo /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285 `" ) && sleep 0' 30564 1726882873.42173: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882873.42188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.42224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.42253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.42312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.42342: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882873.42357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.42385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882873.42398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882873.42410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882873.42423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.42436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.42459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.42476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.42491: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882873.42508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.42595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882873.42621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882873.42638: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882873.42786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882873.44668: stdout chunk (state=3): >>>ansible-tmp-1726882873.4124088-33650-267633691540285=/root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285 <<< 30564 1726882873.44855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882873.44858: stdout chunk (state=3): >>><<< 30564 1726882873.44860: stderr chunk (state=3): >>><<< 30564 1726882873.45176: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882873.4124088-33650-267633691540285=/root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882873.45185: variable 'ansible_module_compression' from source: unknown 30564 1726882873.45188: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882873.45193: variable 'ansible_facts' from source: unknown 30564 1726882873.45308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285/AnsiballZ_package_facts.py 30564 1726882873.45522: Sending initial data 30564 1726882873.45529: Sent initial data (162 bytes) 30564 1726882873.46962: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882873.46975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.46986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.47001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.47037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.47044: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882873.47054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.47073: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882873.47081: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882873.47089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882873.47097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.47106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.47118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.47126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.47134: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882873.47142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.47216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882873.47230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882873.47245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882873.47376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882873.49132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882873.49243: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882873.49331: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp1yo4hnue /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285/AnsiballZ_package_facts.py <<< 30564 1726882873.49433: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882873.52307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882873.52679: stderr chunk (state=3): >>><<< 30564 1726882873.52686: stdout chunk (state=3): 
>>><<< 30564 1726882873.52689: done transferring module to remote 30564 1726882873.52695: _low_level_execute_command(): starting 30564 1726882873.52700: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285/ /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285/AnsiballZ_package_facts.py && sleep 0' 30564 1726882873.53435: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882873.53457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.53491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.53513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.53559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.53579: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882873.53609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.53613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.53615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882873.53617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.53677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882873.53683: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882873.53790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882873.55602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882873.55662: stderr chunk (state=3): >>><<< 30564 1726882873.55668: stdout chunk (state=3): >>><<< 30564 1726882873.55767: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882873.55771: _low_level_execute_command(): starting 30564 1726882873.55774: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285/AnsiballZ_package_facts.py && sleep 0' 30564 1726882873.56392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 30564 1726882873.56417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.56442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.56467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.56518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.56544: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882873.56559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.56591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882873.56605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882873.56623: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882873.56652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882873.56680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882873.56697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882873.56717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882873.56731: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882873.56749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882873.56839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882873.56856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882873.56880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
30564 1726882873.57065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882874.03331: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", 
"version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 30564 1726882874.03493: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": 
"libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": 
[{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": 
"gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": 
"python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 30564 1726882874.03529: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 30564 1726882874.03532: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": 
[{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 30564 1726882874.03535: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 30564 1726882874.03559: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": 
"perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": 
"4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": 
"perl-MIME-Base64"<<< 30564 1726882874.03567: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": 
"481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": 
"vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": 
[{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882874.05088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882874.05189: stderr chunk (state=3): >>><<< 30564 1726882874.05192: stdout chunk (state=3): >>><<< 30564 1726882874.05338: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": 
"29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": 
[{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", 
"version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", 
"release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", 
"release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", 
"release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": 
"8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": 
"1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": 
"perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": 
"glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", 
"release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882874.09097: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30564 1726882874.09116: _low_level_execute_command(): starting
30564 1726882874.09119: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882873.4124088-33650-267633691540285/ > /dev/null 2>&1 && sleep 0'
30564 1726882874.09773: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30564 1726882874.09780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882874.09790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882874.09805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882874.09849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882874.09856: stderr chunk (state=3): >>>debug2: match not found <<<
30564 1726882874.09871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882874.09886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30564 1726882874.09892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
30564 1726882874.09898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30564 1726882874.09906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882874.09914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882874.09929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882874.09939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882874.09946: stderr chunk (state=3): >>>debug2: match found <<<
30564 1726882874.09955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882874.10027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882874.10048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882874.10059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882874.10233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882874.12138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882874.12206: stderr chunk (state=3): >>><<<
30564 1726882874.12209: stdout chunk (state=3): >>><<<
30564 1726882874.12233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.158 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30564 1726882874.12236: handler run complete
30564 1726882874.13653: variable 'ansible_facts' from source: unknown
30564 1726882874.14298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882874.16622: variable 'ansible_facts' from source: unknown
30564 1726882874.17399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882874.19529: attempt loop complete, returning result
30564 1726882874.19544: _execute() done
30564 1726882874.19547: dumping result to json
30564 1726882874.19787: done dumping result, returning
30564 1726882874.19798: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-00000000189d]
30564 1726882874.19804: sending task result for task 0e448fcc-3ce9-4216-acec-00000000189d
ok: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30564 1726882874.22988: no more pending results, returning what we have
30564 1726882874.22991: results queue empty
30564 1726882874.22993: checking for any_errors_fatal
30564 1726882874.22998: done checking for any_errors_fatal
30564 1726882874.22999: checking for max_fail_percentage
30564 1726882874.23001: done checking for max_fail_percentage
30564 1726882874.23001: checking to see if all hosts have failed and the running result is not ok
30564 1726882874.23002: done checking to see if all hosts have failed
30564 1726882874.23003: getting the remaining hosts for this loop
30564 1726882874.23004: done getting the remaining hosts for this loop
30564 1726882874.23008: getting the next task for host managed_node2
30564 1726882874.23017: done getting next task for host managed_node2
30564 1726882874.23021: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
30564 1726882874.23026: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882874.23038: getting variables
30564 1726882874.23039: in VariableManager get_vars()
30564 1726882874.23097: Calling all_inventory to load vars for managed_node2
30564 1726882874.23100: Calling groups_inventory to load vars for managed_node2
30564 1726882874.23102: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882874.23115: Calling all_plugins_play to load vars for managed_node2
30564 1726882874.23118: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882874.23121: Calling groups_plugins_play to load vars for managed_node2
30564 1726882874.24439: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000189d
30564 1726882874.24443: WORKER PROCESS EXITING
30564 1726882874.27092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882874.29127: done with get_vars()
30564 1726882874.29152: done getting variables
30564 1726882874.29214: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 21:41:14 -0400 (0:00:00.935) 0:01:12.873 ******
30564 1726882874.29255: entering _queue_task() for managed_node2/debug
30564 1726882874.29627: worker is 1 (out of 1 available)
30564 1726882874.29642: exiting _queue_task() for managed_node2/debug
30564 1726882874.29658: done queuing things up, now waiting for results queue to drain
30564 1726882874.29659: waiting for pending results...
30564 1726882874.29988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider
30564 1726882874.30155: in run() - task 0e448fcc-3ce9-4216-acec-00000000183b
30564 1726882874.30181: variable 'ansible_search_path' from source: unknown
30564 1726882874.30189: variable 'ansible_search_path' from source: unknown
30564 1726882874.30235: calling self._execute()
30564 1726882874.30370: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882874.30384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882874.30408: variable 'omit' from source: magic vars
30564 1726882874.30807: variable 'ansible_distribution_major_version' from source: facts
30564 1726882874.30824: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882874.30834: variable 'omit' from source: magic vars
30564 1726882874.30903: variable 'omit' from source: magic vars
30564 1726882874.31713: variable 'network_provider' from source: set_fact
30564 1726882874.31735: variable 'omit' from source: magic vars
30564 1726882874.31786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882874.31830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882874.31927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882874.31950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882874.32029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882874.32062: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882874.32078: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882874.32129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882874.32350: Set connection var ansible_timeout to 10
30564 1726882874.32362: Set connection var ansible_pipelining to False
30564 1726882874.32376: Set connection var ansible_shell_type to sh
30564 1726882874.32387: Set connection var ansible_shell_executable to /bin/sh
30564 1726882874.32399: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882874.32406: Set connection var ansible_connection to ssh
30564 1726882874.32467: variable 'ansible_shell_executable' from source: unknown
30564 1726882874.32497: variable 'ansible_connection' from source: unknown
30564 1726882874.32506: variable 'ansible_module_compression' from source: unknown
30564 1726882874.32514: variable 'ansible_shell_type' from source: unknown
30564 1726882874.32521: variable 'ansible_shell_executable' from source: unknown
30564 1726882874.32528: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882874.32540: variable 'ansible_pipelining' from source: unknown
30564 1726882874.32549: variable 'ansible_timeout' from source: unknown
30564 1726882874.32561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882874.32730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882874.32748: variable 'omit' from source: magic vars
30564 1726882874.32760: starting attempt loop
30564 1726882874.32777: running the handler
30564 1726882874.32832: handler run complete
30564 1726882874.32852: attempt loop complete, returning result
30564 1726882874.32859: _execute() done
30564 1726882874.32871: dumping result to json
30564 1726882874.32885: done dumping result, returning
30564 1726882874.32897: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-00000000183b]
30564 1726882874.32907: sending task result for task 0e448fcc-3ce9-4216-acec-00000000183b
ok: [managed_node2] => {}

MSG:

Using network provider: nm
30564 1726882874.33090: no more pending results, returning what we have
30564 1726882874.33094: results queue empty
30564 1726882874.33095: checking for any_errors_fatal
30564 1726882874.33106: done checking for any_errors_fatal
30564 1726882874.33107: checking for max_fail_percentage
30564 1726882874.33108: done checking for max_fail_percentage
30564 1726882874.33109: checking to see if all hosts have failed and the running result is not ok
30564 1726882874.33110: done checking to see if all hosts have failed
30564 1726882874.33111: getting the remaining hosts for this loop
30564 1726882874.33113: done getting the remaining hosts for this loop
30564 1726882874.33117: getting the next task for host managed_node2
30564 1726882874.33127: done getting next task for host managed_node2
30564 1726882874.33131: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
30564 1726882874.33137: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882874.33151: getting variables
30564 1726882874.33153: in VariableManager get_vars()
30564 1726882874.33197: Calling all_inventory to load vars for managed_node2
30564 1726882874.33200: Calling groups_inventory to load vars for managed_node2
30564 1726882874.33203: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882874.33214: Calling all_plugins_play to load vars for managed_node2
30564 1726882874.33217: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882874.33220: Calling groups_plugins_play to load vars for managed_node2
30564 1726882874.34185: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000183b
30564 1726882874.34188: WORKER PROCESS EXITING
30564 1726882874.35107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882874.37110: done with get_vars()
30564 1726882874.37133: done getting variables
30564 1726882874.37200: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:41:14 -0400 (0:00:00.079) 0:01:12.953 ******
30564 1726882874.37242: entering _queue_task() for managed_node2/fail
30564 1726882874.37575: worker is 1 (out of 1 available)
30564 1726882874.37590: exiting _queue_task() for managed_node2/fail
30564 1726882874.38177: done queuing things up, now waiting for results queue to drain
30564 1726882874.38178: waiting for pending results...
30564 1726882874.38853: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
30564 1726882874.39014: in run() - task 0e448fcc-3ce9-4216-acec-00000000183c
30564 1726882874.39034: variable 'ansible_search_path' from source: unknown
30564 1726882874.39043: variable 'ansible_search_path' from source: unknown
30564 1726882874.39093: calling self._execute()
30564 1726882874.39213: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882874.39226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882874.39241: variable 'omit' from source: magic vars
30564 1726882874.39660: variable 'ansible_distribution_major_version' from source: facts
30564 1726882874.39685: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882874.39818: variable 'network_state' from source: role '' defaults
30564 1726882874.39834: Evaluated conditional (network_state != {}): False
30564 1726882874.39841: when evaluation is False, skipping this task
30564 1726882874.39848: _execute() done
30564 1726882874.39858: dumping result to json
30564 1726882874.39876: done dumping result, returning
30564 1726882874.39889: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-00000000183c]
30564 1726882874.39901: sending task result for task 0e448fcc-3ce9-4216-acec-00000000183c
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882874.40056: no more pending results, returning what we have
30564 1726882874.40060: results queue empty
30564 1726882874.40061: checking for any_errors_fatal
30564 1726882874.40075: done checking for any_errors_fatal
30564 1726882874.40076: checking for max_fail_percentage
30564 1726882874.40078: done checking for max_fail_percentage
30564 1726882874.40079: checking to see if all hosts have failed and the running result is not ok
30564 1726882874.40080: done checking to see if all hosts have failed
30564 1726882874.40081: getting the remaining hosts for this loop
30564 1726882874.40083: done getting the remaining hosts for this loop
30564 1726882874.40087: getting the next task for host managed_node2
30564 1726882874.40096: done getting next task for host managed_node2
30564 1726882874.40100: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30564 1726882874.40107: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882874.40129: getting variables
30564 1726882874.40131: in VariableManager get_vars()
30564 1726882874.40177: Calling all_inventory to load vars for managed_node2
30564 1726882874.40181: Calling groups_inventory to load vars for managed_node2
30564 1726882874.40184: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882874.40197: Calling all_plugins_play to load vars for managed_node2
30564 1726882874.40201: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882874.40204: Calling groups_plugins_play to load vars for managed_node2
30564 1726882874.41199: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000183c
30564 1726882874.41202: WORKER PROCESS EXITING
30564 1726882874.43013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882874.45400: done with get_vars()
30564 1726882874.45434: done getting variables
30564 1726882874.45516: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:41:14 -0400 (0:00:00.083) 0:01:13.036 ******
30564 1726882874.45566: entering _queue_task() for managed_node2/fail
30564 1726882874.45989: worker is 1 (out of 1 available)
30564 1726882874.46023: exiting _queue_task() for managed_node2/fail
30564 1726882874.46036: done queuing things up, now waiting for results queue to drain
30564 1726882874.46037: waiting for pending results...
30564 1726882874.46356: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30564 1726882874.46518: in run() - task 0e448fcc-3ce9-4216-acec-00000000183d
30564 1726882874.46535: variable 'ansible_search_path' from source: unknown
30564 1726882874.46542: variable 'ansible_search_path' from source: unknown
30564 1726882874.46597: calling self._execute()
30564 1726882874.46720: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882874.46736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882874.46749: variable 'omit' from source: magic vars
30564 1726882874.47186: variable 'ansible_distribution_major_version' from source: facts
30564 1726882874.47204: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882874.47333: variable 'network_state' from source: role '' defaults
30564 1726882874.47351: Evaluated conditional (network_state != {}): False
30564 1726882874.47358: when evaluation is False, skipping this task
30564 1726882874.47366: _execute() done
30564 1726882874.47377: dumping result to json
30564 1726882874.47387: done dumping result, returning
30564 1726882874.47396: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-00000000183d]
30564 1726882874.47406: sending task result for task 0e448fcc-3ce9-4216-acec-00000000183d
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882874.47549: no more pending results, returning what we have
30564 1726882874.47553: results queue empty
30564 1726882874.47554: checking for any_errors_fatal
30564 1726882874.47562: done checking for any_errors_fatal
30564 1726882874.47563: checking for max_fail_percentage
30564 1726882874.47567: done checking for max_fail_percentage
30564 1726882874.47570: checking to see if all hosts have failed and the running result is not ok
30564 1726882874.47571: done checking to see if all hosts have failed
30564 1726882874.47572: getting the remaining hosts for this loop
30564 1726882874.47574: done getting the remaining hosts for this loop
30564 1726882874.47578: getting the next task for host managed_node2
30564 1726882874.47587: done getting next task for host managed_node2
30564 1726882874.47591: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30564 1726882874.47597: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882874.47621: getting variables
30564 1726882874.47622: in VariableManager get_vars()
30564 1726882874.47667: Calling all_inventory to load vars for managed_node2
30564 1726882874.47672: Calling groups_inventory to load vars for managed_node2
30564 1726882874.47676: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882874.47688: Calling all_plugins_play to load vars for managed_node2
30564 1726882874.47691: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882874.47693: Calling groups_plugins_play to load vars for managed_node2
30564 1726882874.48721: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000183d
30564 1726882874.48725: WORKER PROCESS EXITING
30564 1726882874.51755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882874.53733: done with get_vars()
30564 1726882874.53772: done getting variables
30564 1726882874.53834: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:41:14 -0400 (0:00:00.083) 0:01:13.120 ******
30564 1726882874.53886: entering _queue_task() for managed_node2/fail
30564 1726882874.54248: worker is 1 (out of 1 available)
30564 1726882874.54262: exiting _queue_task() for managed_node2/fail
30564 1726882874.54278: done queuing things up, now waiting for results queue to drain
30564 1726882874.54280: waiting for pending results...
30564 1726882874.54599: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30564 1726882874.54760: in run() - task 0e448fcc-3ce9-4216-acec-00000000183e
30564 1726882874.54782: variable 'ansible_search_path' from source: unknown
30564 1726882874.54791: variable 'ansible_search_path' from source: unknown
30564 1726882874.54837: calling self._execute()
30564 1726882874.54951: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882874.54970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882874.54985: variable 'omit' from source: magic vars
30564 1726882874.55361: variable 'ansible_distribution_major_version' from source: facts
30564 1726882874.55384: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882874.55574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882874.58138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882874.58218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882874.58265: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882874.58309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882874.58348: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882874.58441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882874.58480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882874.58513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882874.59214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882874.59327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882874.59535: variable 'ansible_distribution_major_version' from source: facts
30564 1726882874.59554: Evaluated conditional (ansible_distribution_major_version | int > 9): False
30564 1726882874.59566: when evaluation is False, skipping this task
30564 1726882874.59575: _execute() done
30564 1726882874.59582: dumping result to json
30564 1726882874.59589: done dumping result, returning
30564 1726882874.59600: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
[0e448fcc-3ce9-4216-acec-00000000183e] 30564 1726882874.59611: sending task result for task 0e448fcc-3ce9-4216-acec-00000000183e 30564 1726882874.59730: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000183e skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882874.59787: no more pending results, returning what we have 30564 1726882874.59791: results queue empty 30564 1726882874.59792: checking for any_errors_fatal 30564 1726882874.59800: done checking for any_errors_fatal 30564 1726882874.59801: checking for max_fail_percentage 30564 1726882874.59803: done checking for max_fail_percentage 30564 1726882874.59804: checking to see if all hosts have failed and the running result is not ok 30564 1726882874.59805: done checking to see if all hosts have failed 30564 1726882874.59806: getting the remaining hosts for this loop 30564 1726882874.59807: done getting the remaining hosts for this loop 30564 1726882874.59812: getting the next task for host managed_node2 30564 1726882874.59820: done getting next task for host managed_node2 30564 1726882874.59825: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882874.59830: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882874.59853: getting variables 30564 1726882874.59855: in VariableManager get_vars() 30564 1726882874.59900: Calling all_inventory to load vars for managed_node2 30564 1726882874.59903: Calling groups_inventory to load vars for managed_node2 30564 1726882874.59905: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882874.59916: Calling all_plugins_play to load vars for managed_node2 30564 1726882874.59919: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882874.59922: Calling groups_plugins_play to load vars for managed_node2 30564 1726882874.60942: WORKER PROCESS EXITING 30564 1726882874.62174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882874.64386: done with get_vars() 30564 1726882874.64412: done getting variables 30564 1726882874.64473: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:14 -0400 (0:00:00.106) 0:01:13.226 ****** 30564 1726882874.64506: entering _queue_task() for managed_node2/dnf 30564 1726882874.64852: worker is 1 (out of 1 available) 30564 1726882874.64872: exiting _queue_task() for managed_node2/dnf 30564 1726882874.64885: done queuing things up, now waiting for results queue to drain 30564 1726882874.64886: waiting for pending results... 30564 1726882874.65195: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882874.65325: in run() - task 0e448fcc-3ce9-4216-acec-00000000183f 30564 1726882874.65361: variable 'ansible_search_path' from source: unknown 30564 1726882874.65368: variable 'ansible_search_path' from source: unknown 30564 1726882874.65387: calling self._execute() 30564 1726882874.65875: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882874.65880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882874.65884: variable 'omit' from source: magic vars 30564 1726882874.66132: variable 'ansible_distribution_major_version' from source: facts 30564 1726882874.66147: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882874.66360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882874.70043: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882874.70123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882874.70159: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882874.70199: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882874.70221: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882874.70308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882874.70338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.70366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.70769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.70772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.70775: variable 'ansible_distribution' from source: facts 30564 1726882874.70777: variable 'ansible_distribution_major_version' from source: facts 30564 1726882874.70780: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882874.70782: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882874.70825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882874.70847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.70877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.70920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.70933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.70976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882874.71000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.71030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.71074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.71090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.71133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882874.71156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.71185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.71228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.71243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.71399: variable 'network_connections' from source: include params 30564 1726882874.71410: variable 'interface' from source: play vars 30564 1726882874.71479: variable 'interface' from source: play vars 30564 1726882874.71542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882874.71716: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882874.71750: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882874.71789: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882874.71830: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882874.71873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882874.71902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882874.71926: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.71952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882874.72015: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882874.72268: variable 'network_connections' from source: include params 30564 1726882874.72276: variable 'interface' from source: play vars 30564 1726882874.72340: variable 'interface' from source: play vars 30564 1726882874.72377: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882874.72380: when evaluation is False, skipping this task 30564 1726882874.72383: _execute() done 30564 1726882874.72385: dumping result to json 30564 1726882874.72387: done dumping result, returning 30564 1726882874.72396: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000183f] 30564 1726882874.72402: sending task result for task 0e448fcc-3ce9-4216-acec-00000000183f 30564 1726882874.72508: 
done sending task result for task 0e448fcc-3ce9-4216-acec-00000000183f 30564 1726882874.72510: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882874.72584: no more pending results, returning what we have 30564 1726882874.72588: results queue empty 30564 1726882874.72589: checking for any_errors_fatal 30564 1726882874.72597: done checking for any_errors_fatal 30564 1726882874.72597: checking for max_fail_percentage 30564 1726882874.72600: done checking for max_fail_percentage 30564 1726882874.72601: checking to see if all hosts have failed and the running result is not ok 30564 1726882874.72601: done checking to see if all hosts have failed 30564 1726882874.72602: getting the remaining hosts for this loop 30564 1726882874.72604: done getting the remaining hosts for this loop 30564 1726882874.72608: getting the next task for host managed_node2 30564 1726882874.72617: done getting next task for host managed_node2 30564 1726882874.72621: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882874.72626: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882874.72647: getting variables 30564 1726882874.72649: in VariableManager get_vars() 30564 1726882874.72697: Calling all_inventory to load vars for managed_node2 30564 1726882874.72700: Calling groups_inventory to load vars for managed_node2 30564 1726882874.72703: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882874.72714: Calling all_plugins_play to load vars for managed_node2 30564 1726882874.72718: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882874.72721: Calling groups_plugins_play to load vars for managed_node2 30564 1726882874.74538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882874.76299: done with get_vars() 30564 1726882874.76321: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882874.76399: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team 
interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:14 -0400 (0:00:00.119) 0:01:13.345 ****** 30564 1726882874.76430: entering _queue_task() for managed_node2/yum 30564 1726882874.76710: worker is 1 (out of 1 available) 30564 1726882874.76722: exiting _queue_task() for managed_node2/yum 30564 1726882874.76732: done queuing things up, now waiting for results queue to drain 30564 1726882874.76734: waiting for pending results... 30564 1726882874.77035: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882874.77165: in run() - task 0e448fcc-3ce9-4216-acec-000000001840 30564 1726882874.77184: variable 'ansible_search_path' from source: unknown 30564 1726882874.77189: variable 'ansible_search_path' from source: unknown 30564 1726882874.77221: calling self._execute() 30564 1726882874.77433: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882874.77438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882874.77447: variable 'omit' from source: magic vars 30564 1726882874.78311: variable 'ansible_distribution_major_version' from source: facts 30564 1726882874.78324: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882874.78730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882874.81826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882874.81898: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882874.81932: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882874.81976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882874.82001: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882874.82075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882874.82109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.82133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.82179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.82197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.82292: variable 'ansible_distribution_major_version' from source: facts 30564 1726882874.82310: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882874.82313: when evaluation is False, skipping this task 30564 1726882874.82316: _execute() done 30564 1726882874.82320: dumping result to json 30564 1726882874.82322: done dumping result, returning 30564 1726882874.82330: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for 
network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001840] 30564 1726882874.82333: sending task result for task 0e448fcc-3ce9-4216-acec-000000001840 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882874.82479: no more pending results, returning what we have 30564 1726882874.82483: results queue empty 30564 1726882874.82484: checking for any_errors_fatal 30564 1726882874.82493: done checking for any_errors_fatal 30564 1726882874.82494: checking for max_fail_percentage 30564 1726882874.82496: done checking for max_fail_percentage 30564 1726882874.82497: checking to see if all hosts have failed and the running result is not ok 30564 1726882874.82498: done checking to see if all hosts have failed 30564 1726882874.82499: getting the remaining hosts for this loop 30564 1726882874.82500: done getting the remaining hosts for this loop 30564 1726882874.82504: getting the next task for host managed_node2 30564 1726882874.82512: done getting next task for host managed_node2 30564 1726882874.82516: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882874.82523: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882874.82548: getting variables 30564 1726882874.82550: in VariableManager get_vars() 30564 1726882874.82600: Calling all_inventory to load vars for managed_node2 30564 1726882874.82603: Calling groups_inventory to load vars for managed_node2 30564 1726882874.82606: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882874.82617: Calling all_plugins_play to load vars for managed_node2 30564 1726882874.82620: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882874.82624: Calling groups_plugins_play to load vars for managed_node2 30564 1726882874.83379: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001840 30564 1726882874.83383: WORKER PROCESS EXITING 30564 1726882874.84497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882874.86419: done with get_vars() 30564 1726882874.86442: done getting variables 30564 1726882874.86507: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:14 -0400 (0:00:00.101) 0:01:13.446 ****** 30564 1726882874.86543: entering _queue_task() for managed_node2/fail 30564 1726882874.86841: worker is 1 (out of 1 available) 30564 1726882874.86853: exiting _queue_task() for managed_node2/fail 30564 1726882874.86870: done queuing things up, now waiting for results queue to drain 30564 1726882874.86871: waiting for pending results... 30564 1726882874.87159: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882874.87318: in run() - task 0e448fcc-3ce9-4216-acec-000000001841 30564 1726882874.87337: variable 'ansible_search_path' from source: unknown 30564 1726882874.87344: variable 'ansible_search_path' from source: unknown 30564 1726882874.87386: calling self._execute() 30564 1726882874.87489: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882874.87501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882874.87517: variable 'omit' from source: magic vars 30564 1726882874.87902: variable 'ansible_distribution_major_version' from source: facts 30564 1726882874.87920: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882874.88044: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882874.88250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882874.91437: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882874.91624: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882874.91714: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882874.91808: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882874.91909: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882874.92107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882874.92141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.92177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.92249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.92333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.92388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882874.92559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.92595: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.92753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.92781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.92826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882874.92856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882874.92892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.93014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882874.93034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882874.93375: variable 'network_connections' from source: include params 30564 1726882874.93529: variable 'interface' from source: play vars 30564 1726882874.93601: variable 'interface' from source: play vars 30564 1726882874.93795: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882874.94100: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882874.94209: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882874.94312: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882874.94344: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882874.94430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882874.94520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882874.94550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882874.94632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882874.94698: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882874.95298: variable 'network_connections' from source: include params 30564 1726882874.95308: variable 'interface' from source: play vars 30564 1726882874.95485: variable 'interface' from source: play vars 30564 1726882874.95521: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882874.95529: when evaluation is False, skipping this task 30564 
1726882874.95537: _execute() done 30564 1726882874.95545: dumping result to json 30564 1726882874.95552: done dumping result, returning 30564 1726882874.95567: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001841] 30564 1726882874.95588: sending task result for task 0e448fcc-3ce9-4216-acec-000000001841 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882874.95842: no more pending results, returning what we have 30564 1726882874.95846: results queue empty 30564 1726882874.95848: checking for any_errors_fatal 30564 1726882874.95855: done checking for any_errors_fatal 30564 1726882874.95856: checking for max_fail_percentage 30564 1726882874.95858: done checking for max_fail_percentage 30564 1726882874.95859: checking to see if all hosts have failed and the running result is not ok 30564 1726882874.95860: done checking to see if all hosts have failed 30564 1726882874.95861: getting the remaining hosts for this loop 30564 1726882874.95863: done getting the remaining hosts for this loop 30564 1726882874.95870: getting the next task for host managed_node2 30564 1726882874.95880: done getting next task for host managed_node2 30564 1726882874.95884: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882874.95890: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882874.95912: getting variables 30564 1726882874.95914: in VariableManager get_vars() 30564 1726882874.95954: Calling all_inventory to load vars for managed_node2 30564 1726882874.95957: Calling groups_inventory to load vars for managed_node2 30564 1726882874.95959: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882874.95974: Calling all_plugins_play to load vars for managed_node2 30564 1726882874.95978: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882874.95981: Calling groups_plugins_play to load vars for managed_node2 30564 1726882874.96984: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001841 30564 1726882874.96987: WORKER PROCESS EXITING 30564 1726882874.98627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882875.01278: done with get_vars() 30564 1726882875.01305: done getting variables 30564 1726882875.01365: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:15 -0400 (0:00:00.148) 0:01:13.595 ****** 30564 1726882875.01405: entering _queue_task() for managed_node2/package 30564 1726882875.01715: worker is 1 (out of 1 available) 30564 1726882875.01729: exiting _queue_task() for managed_node2/package 30564 1726882875.01741: done queuing things up, now waiting for results queue to drain 30564 1726882875.01742: waiting for pending results... 30564 1726882875.02751: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882875.03069: in run() - task 0e448fcc-3ce9-4216-acec-000000001842 30564 1726882875.03144: variable 'ansible_search_path' from source: unknown 30564 1726882875.03154: variable 'ansible_search_path' from source: unknown 30564 1726882875.03199: calling self._execute() 30564 1726882875.03416: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882875.03462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882875.03502: variable 'omit' from source: magic vars 30564 1726882875.04292: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.04309: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882875.04725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882875.05321: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882875.05400: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882875.05508: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882875.05634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882875.05860: variable 'network_packages' from source: role '' defaults 30564 1726882875.06085: variable '__network_provider_setup' from source: role '' defaults 30564 1726882875.06103: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882875.06351: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882875.06366: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882875.06776: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882875.06941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882875.18626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882875.19123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882875.19166: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882875.19241: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882875.19657: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882875.19729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.19760: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.19793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.19836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.19855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.19904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.19933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.19962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.20007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.20026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882875.20247: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882875.20643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.20882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.20909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.20944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.20980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.21084: variable 'ansible_python' from source: facts 30564 1726882875.21203: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882875.21284: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882875.21365: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882875.21479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.21510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.21542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.21637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.21657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.21708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.21743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.21775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.21822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.21842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.21996: variable 'network_connections' from source: include params 
30564 1726882875.22013: variable 'interface' from source: play vars 30564 1726882875.22121: variable 'interface' from source: play vars 30564 1726882875.22199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882875.22236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882875.22273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.22311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882875.22357: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882875.22750: variable 'network_connections' from source: include params 30564 1726882875.22760: variable 'interface' from source: play vars 30564 1726882875.22870: variable 'interface' from source: play vars 30564 1726882875.22956: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882875.23045: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882875.23372: variable 'network_connections' from source: include params 30564 1726882875.23383: variable 'interface' from source: play vars 30564 1726882875.23452: variable 'interface' from source: play vars 30564 1726882875.23484: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882875.23569: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882875.23902: variable 'network_connections' 
from source: include params 30564 1726882875.23912: variable 'interface' from source: play vars 30564 1726882875.23978: variable 'interface' from source: play vars 30564 1726882875.24044: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882875.24150: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882875.24163: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882875.24232: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882875.24456: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882875.24949: variable 'network_connections' from source: include params 30564 1726882875.24958: variable 'interface' from source: play vars 30564 1726882875.25023: variable 'interface' from source: play vars 30564 1726882875.25037: variable 'ansible_distribution' from source: facts 30564 1726882875.25046: variable '__network_rh_distros' from source: role '' defaults 30564 1726882875.25055: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.25091: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882875.25260: variable 'ansible_distribution' from source: facts 30564 1726882875.25273: variable '__network_rh_distros' from source: role '' defaults 30564 1726882875.25302: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.25318: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882875.25489: variable 'ansible_distribution' from source: facts 30564 1726882875.25497: variable '__network_rh_distros' from source: role '' defaults 30564 1726882875.25505: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.25544: variable 'network_provider' from source: set_fact 30564 
1726882875.25565: variable 'ansible_facts' from source: unknown 30564 1726882875.26305: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882875.26312: when evaluation is False, skipping this task 30564 1726882875.26319: _execute() done 30564 1726882875.26324: dumping result to json 30564 1726882875.26331: done dumping result, returning 30564 1726882875.26342: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000001842] 30564 1726882875.26350: sending task result for task 0e448fcc-3ce9-4216-acec-000000001842 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882875.26507: no more pending results, returning what we have 30564 1726882875.26511: results queue empty 30564 1726882875.26512: checking for any_errors_fatal 30564 1726882875.26520: done checking for any_errors_fatal 30564 1726882875.26521: checking for max_fail_percentage 30564 1726882875.26522: done checking for max_fail_percentage 30564 1726882875.26524: checking to see if all hosts have failed and the running result is not ok 30564 1726882875.26524: done checking to see if all hosts have failed 30564 1726882875.26525: getting the remaining hosts for this loop 30564 1726882875.26527: done getting the remaining hosts for this loop 30564 1726882875.26531: getting the next task for host managed_node2 30564 1726882875.26540: done getting next task for host managed_node2 30564 1726882875.26544: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882875.26550: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882875.26571: getting variables 30564 1726882875.26573: in VariableManager get_vars() 30564 1726882875.26610: Calling all_inventory to load vars for managed_node2 30564 1726882875.26613: Calling groups_inventory to load vars for managed_node2 30564 1726882875.26620: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882875.26630: Calling all_plugins_play to load vars for managed_node2 30564 1726882875.26633: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882875.26635: Calling groups_plugins_play to load vars for managed_node2 30564 1726882875.28341: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001842 30564 1726882875.28344: WORKER PROCESS EXITING 30564 1726882875.41032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882875.44782: done with get_vars() 30564 1726882875.44814: done getting variables 30564 1726882875.44869: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:15 -0400 (0:00:00.437) 0:01:14.033 ****** 30564 1726882875.45185: entering _queue_task() for managed_node2/package 30564 1726882875.45538: worker is 1 (out of 1 available) 30564 1726882875.45552: exiting _queue_task() for managed_node2/package 30564 1726882875.45567: done queuing things up, now waiting for results queue to drain 30564 1726882875.45570: waiting for pending results... 30564 1726882875.46726: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882875.47034: in run() - task 0e448fcc-3ce9-4216-acec-000000001843 30564 1726882875.47118: variable 'ansible_search_path' from source: unknown 30564 1726882875.47128: variable 'ansible_search_path' from source: unknown 30564 1726882875.47171: calling self._execute() 30564 1726882875.47486: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882875.47499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882875.47516: variable 'omit' from source: magic vars 30564 1726882875.48446: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.48471: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882875.48609: variable 'network_state' from source: role '' defaults 30564 1726882875.48747: Evaluated conditional (network_state != {}): False 30564 1726882875.48755: when evaluation 
is False, skipping this task 30564 1726882875.48763: _execute() done 30564 1726882875.48777: dumping result to json 30564 1726882875.48802: done dumping result, returning 30564 1726882875.48854: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001843] 30564 1726882875.48871: sending task result for task 0e448fcc-3ce9-4216-acec-000000001843 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882875.49037: no more pending results, returning what we have 30564 1726882875.49042: results queue empty 30564 1726882875.49043: checking for any_errors_fatal 30564 1726882875.49052: done checking for any_errors_fatal 30564 1726882875.49052: checking for max_fail_percentage 30564 1726882875.49054: done checking for max_fail_percentage 30564 1726882875.49055: checking to see if all hosts have failed and the running result is not ok 30564 1726882875.49056: done checking to see if all hosts have failed 30564 1726882875.49057: getting the remaining hosts for this loop 30564 1726882875.49059: done getting the remaining hosts for this loop 30564 1726882875.49065: getting the next task for host managed_node2 30564 1726882875.49076: done getting next task for host managed_node2 30564 1726882875.49081: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882875.49089: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882875.49115: getting variables 30564 1726882875.49117: in VariableManager get_vars() 30564 1726882875.49158: Calling all_inventory to load vars for managed_node2 30564 1726882875.49160: Calling groups_inventory to load vars for managed_node2 30564 1726882875.49163: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882875.49179: Calling all_plugins_play to load vars for managed_node2 30564 1726882875.49181: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882875.49184: Calling groups_plugins_play to load vars for managed_node2 30564 1726882875.50487: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001843 30564 1726882875.50491: WORKER PROCESS EXITING 30564 1726882875.51961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882875.54962: done with get_vars() 30564 1726882875.54990: done getting variables 30564 1726882875.55049: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:15 -0400 (0:00:00.099) 0:01:14.132 ****** 30564 1726882875.55089: entering _queue_task() for managed_node2/package 30564 1726882875.56135: worker is 1 (out of 1 available) 30564 1726882875.56147: exiting _queue_task() for managed_node2/package 30564 1726882875.56159: done queuing things up, now waiting for results queue to drain 30564 1726882875.56160: waiting for pending results... 30564 1726882875.57109: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882875.57504: in run() - task 0e448fcc-3ce9-4216-acec-000000001844 30564 1726882875.57526: variable 'ansible_search_path' from source: unknown 30564 1726882875.57536: variable 'ansible_search_path' from source: unknown 30564 1726882875.57585: calling self._execute() 30564 1726882875.57815: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882875.57884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882875.57902: variable 'omit' from source: magic vars 30564 1726882875.58655: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.58815: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882875.59054: variable 'network_state' from source: role '' defaults 30564 1726882875.59065: Evaluated conditional (network_state != {}): False 30564 1726882875.59069: when evaluation is False, skipping this task 30564 1726882875.59075: _execute() done 30564 1726882875.59078: dumping 
result to json 30564 1726882875.59081: done dumping result, returning 30564 1726882875.59090: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001844] 30564 1726882875.59096: sending task result for task 0e448fcc-3ce9-4216-acec-000000001844 30564 1726882875.60194: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001844 30564 1726882875.60197: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882875.60241: no more pending results, returning what we have 30564 1726882875.60244: results queue empty 30564 1726882875.60245: checking for any_errors_fatal 30564 1726882875.60253: done checking for any_errors_fatal 30564 1726882875.60254: checking for max_fail_percentage 30564 1726882875.60255: done checking for max_fail_percentage 30564 1726882875.60256: checking to see if all hosts have failed and the running result is not ok 30564 1726882875.60257: done checking to see if all hosts have failed 30564 1726882875.60258: getting the remaining hosts for this loop 30564 1726882875.60260: done getting the remaining hosts for this loop 30564 1726882875.60266: getting the next task for host managed_node2 30564 1726882875.60273: done getting next task for host managed_node2 30564 1726882875.60279: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882875.60284: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882875.60307: getting variables 30564 1726882875.60309: in VariableManager get_vars() 30564 1726882875.60347: Calling all_inventory to load vars for managed_node2 30564 1726882875.60350: Calling groups_inventory to load vars for managed_node2 30564 1726882875.60352: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882875.60367: Calling all_plugins_play to load vars for managed_node2 30564 1726882875.60369: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882875.60373: Calling groups_plugins_play to load vars for managed_node2 30564 1726882875.62155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882875.64013: done with get_vars() 30564 1726882875.64040: done getting variables 30564 1726882875.64110: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or 
team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:15 -0400 (0:00:00.090) 0:01:14.222 ****** 30564 1726882875.64148: entering _queue_task() for managed_node2/service 30564 1726882875.64497: worker is 1 (out of 1 available) 30564 1726882875.64516: exiting _queue_task() for managed_node2/service 30564 1726882875.64528: done queuing things up, now waiting for results queue to drain 30564 1726882875.64529: waiting for pending results... 30564 1726882875.64839: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882875.65004: in run() - task 0e448fcc-3ce9-4216-acec-000000001845 30564 1726882875.65025: variable 'ansible_search_path' from source: unknown 30564 1726882875.65034: variable 'ansible_search_path' from source: unknown 30564 1726882875.65086: calling self._execute() 30564 1726882875.65204: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882875.65217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882875.65232: variable 'omit' from source: magic vars 30564 1726882875.65661: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.65683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882875.65828: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882875.66046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882875.68809: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882875.68881: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882875.68933: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882875.68976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882875.69014: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882875.69104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.69139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.69173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.69227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.69248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.69298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.69334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.69367: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.69413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.69440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.69487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.69515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.69552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.69602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.69620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.70510: variable 'network_connections' from source: include params 30564 1726882875.70531: variable 'interface' from source: play vars 30564 1726882875.70697: variable 'interface' from source: play vars 30564 1726882875.70986: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882875.71184: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882875.71230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882875.71279: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882875.71324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882875.71371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882875.71408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882875.71441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.71474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882875.71546: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882875.71812: variable 'network_connections' from source: include params 30564 1726882875.71832: variable 'interface' from source: play vars 30564 1726882875.71899: variable 'interface' from source: play vars 30564 1726882875.71945: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882875.71956: when evaluation is False, skipping this task 30564 
1726882875.71968: _execute() done 30564 1726882875.71976: dumping result to json 30564 1726882875.71986: done dumping result, returning 30564 1726882875.71999: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001845] 30564 1726882875.72012: sending task result for task 0e448fcc-3ce9-4216-acec-000000001845 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882875.72222: no more pending results, returning what we have 30564 1726882875.72227: results queue empty 30564 1726882875.72228: checking for any_errors_fatal 30564 1726882875.72237: done checking for any_errors_fatal 30564 1726882875.72238: checking for max_fail_percentage 30564 1726882875.72240: done checking for max_fail_percentage 30564 1726882875.72241: checking to see if all hosts have failed and the running result is not ok 30564 1726882875.72242: done checking to see if all hosts have failed 30564 1726882875.72242: getting the remaining hosts for this loop 30564 1726882875.72245: done getting the remaining hosts for this loop 30564 1726882875.72249: getting the next task for host managed_node2 30564 1726882875.72258: done getting next task for host managed_node2 30564 1726882875.72270: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882875.72276: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882875.72299: getting variables 30564 1726882875.72302: in VariableManager get_vars() 30564 1726882875.72347: Calling all_inventory to load vars for managed_node2 30564 1726882875.72350: Calling groups_inventory to load vars for managed_node2 30564 1726882875.72353: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882875.72367: Calling all_plugins_play to load vars for managed_node2 30564 1726882875.72370: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882875.72374: Calling groups_plugins_play to load vars for managed_node2 30564 1726882875.73456: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001845 30564 1726882875.73460: WORKER PROCESS EXITING 30564 1726882875.74454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882875.76549: done with get_vars() 30564 1726882875.76577: done getting variables 30564 1726882875.76646: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:15 -0400 (0:00:00.125) 0:01:14.348 ****** 30564 1726882875.76688: entering _queue_task() for managed_node2/service 30564 1726882875.77030: worker is 1 (out of 1 available) 30564 1726882875.77053: exiting _queue_task() for managed_node2/service 30564 1726882875.77067: done queuing things up, now waiting for results queue to drain 30564 1726882875.77069: waiting for pending results... 30564 1726882875.77378: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882875.77589: in run() - task 0e448fcc-3ce9-4216-acec-000000001846 30564 1726882875.77631: variable 'ansible_search_path' from source: unknown 30564 1726882875.77644: variable 'ansible_search_path' from source: unknown 30564 1726882875.77686: calling self._execute() 30564 1726882875.77799: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882875.77812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882875.77838: variable 'omit' from source: magic vars 30564 1726882875.78231: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.78254: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882875.78431: variable 'network_provider' from source: set_fact 30564 1726882875.78442: variable 'network_state' from source: role '' defaults 30564 1726882875.78456: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882875.78470: variable 'omit' from source: magic vars 30564 1726882875.78543: variable 
'omit' from source: magic vars 30564 1726882875.78576: variable 'network_service_name' from source: role '' defaults 30564 1726882875.78657: variable 'network_service_name' from source: role '' defaults 30564 1726882875.78778: variable '__network_provider_setup' from source: role '' defaults 30564 1726882875.78789: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882875.78868: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882875.78883: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882875.78957: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882875.79210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882875.82655: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882875.82750: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882875.82797: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882875.82839: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882875.82882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882875.82974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.83011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.83044: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.83103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.83124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.83180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.83213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.83242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.83298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.83320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.83594: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882875.83735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.83767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.83797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.83852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.83878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.83985: variable 'ansible_python' from source: facts 30564 1726882875.84008: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882875.84107: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882875.84200: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882875.84338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.84374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.84405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.84445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.84459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.84518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882875.84556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882875.84592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.84636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882875.84651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882875.84802: variable 'network_connections' from source: include params 30564 1726882875.84813: variable 'interface' from source: play vars 30564 1726882875.84888: variable 'interface' from source: play vars 30564 1726882875.85009: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882875.85221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882875.85283: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882875.85327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882875.85382: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882875.85446: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882875.85489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882875.85523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882875.85561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882875.85618: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882875.85932: variable 'network_connections' from source: include params 30564 1726882875.85943: variable 'interface' from source: play vars 30564 1726882875.86024: variable 'interface' from source: play vars 30564 1726882875.86073: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882875.86160: variable '__network_wireless_connections_defined' from source: role '' defaults 
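The skip and run decisions recorded in this trace reduce to ordinary boolean tests on role variables: "Install python3-libnmstate..." was skipped because `network_state != {}` evaluated False, while "Enable and start NetworkManager" proceeds because `network_provider == "nm" or network_state != {}` evaluated True. As a rough illustration only (values copied from the log; this is not the role's actual implementation, and the variable assignments below are assumptions mirroring what the log reports), the same logic in plain Python:

```python
# Values as reported in the log:
network_state = {}        # "variable 'network_state' from source: role '' defaults"
network_provider = "nm"   # "variable 'network_provider' from source: set_fact"

# Task "Install python3-libnmstate when using network_state variable"
# is gated on: network_state != {}
install_libnmstate = network_state != {}
print(install_libnmstate)   # False -> "skipping: [managed_node2]"

# Task "Enable and start NetworkManager" is gated on:
#   network_provider == "nm" or network_state != {}
enable_networkmanager = network_provider == "nm" or network_state != {}
print(enable_networkmanager)  # True -> task proceeds to the service action
```

Ansible evaluates these `when:` expressions as Jinja2 conditionals against the merged variable context; the log's "Evaluated conditional (...): True/False" lines show exactly these outcomes.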
30564 1726882875.86480: variable 'network_connections' from source: include params 30564 1726882875.86490: variable 'interface' from source: play vars 30564 1726882875.86565: variable 'interface' from source: play vars 30564 1726882875.86597: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882875.86687: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882875.87033: variable 'network_connections' from source: include params 30564 1726882875.87043: variable 'interface' from source: play vars 30564 1726882875.87223: variable 'interface' from source: play vars 30564 1726882875.87294: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882875.87475: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882875.87487: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882875.87650: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882875.88115: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882875.89130: variable 'network_connections' from source: include params 30564 1726882875.89141: variable 'interface' from source: play vars 30564 1726882875.89215: variable 'interface' from source: play vars 30564 1726882875.89230: variable 'ansible_distribution' from source: facts 30564 1726882875.89238: variable '__network_rh_distros' from source: role '' defaults 30564 1726882875.89248: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.89291: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882875.89475: variable 'ansible_distribution' from source: facts 30564 1726882875.89486: variable '__network_rh_distros' from source: role '' defaults 30564 1726882875.89496: variable 'ansible_distribution_major_version' from 
source: facts 30564 1726882875.89514: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882875.89699: variable 'ansible_distribution' from source: facts 30564 1726882875.89708: variable '__network_rh_distros' from source: role '' defaults 30564 1726882875.89720: variable 'ansible_distribution_major_version' from source: facts 30564 1726882875.89762: variable 'network_provider' from source: set_fact 30564 1726882875.89793: variable 'omit' from source: magic vars 30564 1726882875.89830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882875.89867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882875.89893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882875.89920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882875.89936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882875.89975: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882875.89985: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882875.89992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882875.90103: Set connection var ansible_timeout to 10 30564 1726882875.90115: Set connection var ansible_pipelining to False 30564 1726882875.90126: Set connection var ansible_shell_type to sh 30564 1726882875.90137: Set connection var ansible_shell_executable to /bin/sh 30564 1726882875.90150: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882875.90159: Set connection var ansible_connection to ssh 30564 1726882875.90196: variable 'ansible_shell_executable' from 
source: unknown 30564 1726882875.90204: variable 'ansible_connection' from source: unknown 30564 1726882875.90211: variable 'ansible_module_compression' from source: unknown 30564 1726882875.90218: variable 'ansible_shell_type' from source: unknown 30564 1726882875.90224: variable 'ansible_shell_executable' from source: unknown 30564 1726882875.90236: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882875.90244: variable 'ansible_pipelining' from source: unknown 30564 1726882875.90250: variable 'ansible_timeout' from source: unknown 30564 1726882875.90258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882875.90378: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882875.90400: variable 'omit' from source: magic vars 30564 1726882875.90421: starting attempt loop 30564 1726882875.90430: running the handler 30564 1726882875.90637: variable 'ansible_facts' from source: unknown 30564 1726882875.92035: _low_level_execute_command(): starting 30564 1726882875.92047: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882875.92944: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882875.92960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882875.92979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882875.92999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882875.93046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882875.93059: stderr chunk 
(state=3): >>>debug2: match not found <<< 30564 1726882875.93080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882875.93098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882875.93113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882875.93129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882875.93142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882875.93157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882875.93175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882875.93187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882875.93200: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882875.93213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882875.93295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882875.93318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882875.93342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882875.93544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882875.95131: stdout chunk (state=3): >>>/root <<< 30564 1726882875.95310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882875.95313: stdout chunk (state=3): >>><<< 30564 1726882875.95315: stderr chunk (state=3): >>><<< 30564 1726882875.95370: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882875.95374: _low_level_execute_command(): starting 30564 1726882875.95377: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507 `" && echo ansible-tmp-1726882875.9533124-33779-179036978810507="` echo /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507 `" ) && sleep 0' 30564 1726882875.96057: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882875.96060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882875.96106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882875.96109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882875.96112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882875.96114: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882875.96195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882875.96214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882875.96320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882875.98191: stdout chunk (state=3): >>>ansible-tmp-1726882875.9533124-33779-179036978810507=/root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507 <<< 30564 1726882875.98393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882875.98396: stdout chunk (state=3): >>><<< 30564 1726882875.98398: stderr chunk (state=3): >>><<< 30564 1726882875.98775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882875.9533124-33779-179036978810507=/root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882875.98779: variable 'ansible_module_compression' from source: unknown 30564 1726882875.98781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882875.98787: variable 'ansible_facts' from source: unknown 30564 1726882875.98789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507/AnsiballZ_systemd.py 30564 1726882875.98949: Sending initial data 30564 1726882875.98952: Sent initial data (156 bytes) 30564 1726882875.99954: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882875.99974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882875.99992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.00009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.00049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882876.00069: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882876.00087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.00108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882876.00119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882876.00129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882876.00140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.00152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.00175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.00196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.00207: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882876.00219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.00291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882876.00315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882876.00330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882876.00462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882876.02203: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882876.02305: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882876.02407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpsyxkan85 /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507/AnsiballZ_systemd.py <<< 30564 1726882876.02500: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882876.05742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882876.05749: stderr chunk (state=3): >>><<< 30564 1726882876.05752: stdout chunk (state=3): >>><<< 30564 1726882876.05774: done transferring module to remote 30564 1726882876.05785: _low_level_execute_command(): starting 30564 1726882876.05790: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507/ /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507/AnsiballZ_systemd.py && sleep 0' 30564 1726882876.06381: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882876.06389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.06400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.06413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.06452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.06455: stderr chunk (state=3): >>>debug2: match not found 
<<< 30564 1726882876.06464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.06481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882876.06488: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882876.06495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882876.06503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.06512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.06523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.06530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.06538: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882876.06545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.06617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882876.06634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882876.06645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882876.06775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882876.08569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882876.08639: stderr chunk (state=3): >>><<< 30564 1726882876.08642: stdout chunk (state=3): >>><<< 30564 1726882876.08658: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882876.08662: _low_level_execute_command(): starting 30564 1726882876.08669: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507/AnsiballZ_systemd.py && sleep 0' 30564 1726882876.09441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882876.09460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.09465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.09484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.09545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.09549: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882876.09559: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.09587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882876.09595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882876.09611: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882876.09614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.09634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.09651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.09665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.09688: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882876.09691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.09798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882876.09814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882876.09827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882876.10031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882876.35106: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", 
"RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9170944", "MemoryAvailable": "infinity", "CPUUsageNSec": "2245746000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": 
"18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 30564 1726882876.35113: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", 
"NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": 
"multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882876.36715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882876.36722: stderr chunk (state=3): >>><<< 30564 1726882876.36724: stdout chunk (state=3): >>><<< 30564 1726882876.36727: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9170944", "MemoryAvailable": "infinity", "CPUUsageNSec": "2245746000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": 
"316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882876.36926: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882876.36981: _low_level_execute_command(): starting 30564 1726882876.36984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882875.9533124-33779-179036978810507/ > /dev/null 2>&1 && sleep 0' 30564 1726882876.40896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 
1726882876.40911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.40924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.40950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.40996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.41020: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882876.41056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.41080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882876.41092: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882876.41102: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882876.41222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.41236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.41251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.41269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.41286: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882876.41301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.41379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882876.41406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882876.41420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882876.41543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882876.43425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882876.43768: stderr chunk (state=3): >>><<< 30564 1726882876.43771: stdout chunk (state=3): >>><<< 30564 1726882876.43829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882876.43833: handler run complete 30564 1726882876.43967: attempt loop complete, returning result 30564 1726882876.43970: _execute() done 30564 1726882876.44114: dumping result to json 30564 1726882876.44117: done dumping result, returning 30564 1726882876.44119: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000001846] 30564 1726882876.44121: sending 
task result for task 0e448fcc-3ce9-4216-acec-000000001846 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882876.44809: no more pending results, returning what we have 30564 1726882876.44812: results queue empty 30564 1726882876.44813: checking for any_errors_fatal 30564 1726882876.44823: done checking for any_errors_fatal 30564 1726882876.44824: checking for max_fail_percentage 30564 1726882876.44826: done checking for max_fail_percentage 30564 1726882876.44827: checking to see if all hosts have failed and the running result is not ok 30564 1726882876.44828: done checking to see if all hosts have failed 30564 1726882876.44829: getting the remaining hosts for this loop 30564 1726882876.44840: done getting the remaining hosts for this loop 30564 1726882876.44844: getting the next task for host managed_node2 30564 1726882876.44854: done getting next task for host managed_node2 30564 1726882876.44858: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882876.44866: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882876.44879: getting variables 30564 1726882876.44881: in VariableManager get_vars() 30564 1726882876.44918: Calling all_inventory to load vars for managed_node2 30564 1726882876.44921: Calling groups_inventory to load vars for managed_node2 30564 1726882876.44924: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882876.44935: Calling all_plugins_play to load vars for managed_node2 30564 1726882876.44938: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882876.44941: Calling groups_plugins_play to load vars for managed_node2 30564 1726882876.45738: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001846 30564 1726882876.45747: WORKER PROCESS EXITING 30564 1726882876.47847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882876.50842: done with get_vars() 30564 1726882876.50875: done getting variables 30564 1726882876.50941: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:16 -0400 (0:00:00.745) 0:01:15.093 ****** 30564 1726882876.51211: entering _queue_task() for managed_node2/service 30564 1726882876.52069: worker is 1 (out of 1 available) 30564 1726882876.52082: exiting 
_queue_task() for managed_node2/service 30564 1726882876.52095: done queuing things up, now waiting for results queue to drain 30564 1726882876.52096: waiting for pending results... 30564 1726882876.53324: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882876.53485: in run() - task 0e448fcc-3ce9-4216-acec-000000001847 30564 1726882876.53508: variable 'ansible_search_path' from source: unknown 30564 1726882876.53517: variable 'ansible_search_path' from source: unknown 30564 1726882876.53563: calling self._execute() 30564 1726882876.53684: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882876.53696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882876.53713: variable 'omit' from source: magic vars 30564 1726882876.54135: variable 'ansible_distribution_major_version' from source: facts 30564 1726882876.54154: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882876.54287: variable 'network_provider' from source: set_fact 30564 1726882876.54303: Evaluated conditional (network_provider == "nm"): True 30564 1726882876.54421: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882876.54525: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882876.54716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882876.57295: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882876.57368: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882876.57407: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882876.57446: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882876.57487: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882876.57590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882876.57624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882876.57657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882876.57714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882876.57735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882876.57831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882876.57859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882876.57928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 30564 1726882876.58006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882876.58035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882876.58114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882876.58148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882876.58187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882876.59086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882876.59105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882876.59269: variable 'network_connections' from source: include params 30564 1726882876.59502: variable 'interface' from source: play vars 30564 1726882876.59576: variable 'interface' from source: play vars 30564 1726882876.59685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882876.59888: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882876.59935: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882876.59995: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882876.60025: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882876.60083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882876.60109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882876.60153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882876.60194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882876.60244: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882876.60510: variable 'network_connections' from source: include params 30564 1726882876.60519: variable 'interface' from source: play vars 30564 1726882876.60584: variable 'interface' from source: play vars 30564 1726882876.60627: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882876.60636: when evaluation is False, skipping this task 30564 1726882876.60643: _execute() done 30564 1726882876.60650: dumping result to json 30564 1726882876.60657: done dumping result, returning 30564 1726882876.60672: done running TaskExecutor() for 
managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000001847] 30564 1726882876.60700: sending task result for task 0e448fcc-3ce9-4216-acec-000000001847 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882876.60858: no more pending results, returning what we have 30564 1726882876.60863: results queue empty 30564 1726882876.60866: checking for any_errors_fatal 30564 1726882876.60890: done checking for any_errors_fatal 30564 1726882876.60891: checking for max_fail_percentage 30564 1726882876.60893: done checking for max_fail_percentage 30564 1726882876.60894: checking to see if all hosts have failed and the running result is not ok 30564 1726882876.60895: done checking to see if all hosts have failed 30564 1726882876.60896: getting the remaining hosts for this loop 30564 1726882876.60898: done getting the remaining hosts for this loop 30564 1726882876.60902: getting the next task for host managed_node2 30564 1726882876.60911: done getting next task for host managed_node2 30564 1726882876.60916: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882876.60921: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882876.60942: getting variables 30564 1726882876.60943: in VariableManager get_vars() 30564 1726882876.60987: Calling all_inventory to load vars for managed_node2 30564 1726882876.60991: Calling groups_inventory to load vars for managed_node2 30564 1726882876.60993: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882876.61004: Calling all_plugins_play to load vars for managed_node2 30564 1726882876.61007: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882876.61010: Calling groups_plugins_play to load vars for managed_node2 30564 1726882876.62037: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001847 30564 1726882876.62041: WORKER PROCESS EXITING 30564 1726882876.63060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882876.64907: done with get_vars() 30564 1726882876.64932: done getting variables 30564 1726882876.64994: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:16 -0400 (0:00:00.138) 0:01:15.231 ****** 30564 1726882876.65025: entering _queue_task() for managed_node2/service 30564 1726882876.65341: worker is 1 (out of 1 available) 30564 1726882876.65355: exiting _queue_task() for managed_node2/service 30564 1726882876.65368: done queuing things up, now waiting for results queue to drain 30564 1726882876.65369: waiting for pending results... 30564 1726882876.65668: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882876.65822: in run() - task 0e448fcc-3ce9-4216-acec-000000001848 30564 1726882876.65839: variable 'ansible_search_path' from source: unknown 30564 1726882876.65850: variable 'ansible_search_path' from source: unknown 30564 1726882876.65890: calling self._execute() 30564 1726882876.66005: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882876.66017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882876.66037: variable 'omit' from source: magic vars 30564 1726882876.66430: variable 'ansible_distribution_major_version' from source: facts 30564 1726882876.66449: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882876.66578: variable 'network_provider' from source: set_fact 30564 1726882876.66590: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882876.66598: when evaluation is False, skipping this task 30564 1726882876.66610: _execute() done 30564 1726882876.66617: dumping result to json 30564 1726882876.66625: done dumping result, returning 30564 1726882876.66635: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000001848] 30564 1726882876.66647: sending task result for task 
0e448fcc-3ce9-4216-acec-000000001848 30564 1726882876.66767: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001848 30564 1726882876.66776: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882876.66828: no more pending results, returning what we have 30564 1726882876.66832: results queue empty 30564 1726882876.66834: checking for any_errors_fatal 30564 1726882876.66843: done checking for any_errors_fatal 30564 1726882876.66844: checking for max_fail_percentage 30564 1726882876.66846: done checking for max_fail_percentage 30564 1726882876.66847: checking to see if all hosts have failed and the running result is not ok 30564 1726882876.66848: done checking to see if all hosts have failed 30564 1726882876.66849: getting the remaining hosts for this loop 30564 1726882876.66851: done getting the remaining hosts for this loop 30564 1726882876.66855: getting the next task for host managed_node2 30564 1726882876.66866: done getting next task for host managed_node2 30564 1726882876.66870: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882876.66877: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882876.66902: getting variables 30564 1726882876.66904: in VariableManager get_vars() 30564 1726882876.66947: Calling all_inventory to load vars for managed_node2 30564 1726882876.66950: Calling groups_inventory to load vars for managed_node2 30564 1726882876.66953: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882876.66967: Calling all_plugins_play to load vars for managed_node2 30564 1726882876.66971: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882876.66974: Calling groups_plugins_play to load vars for managed_node2 30564 1726882876.68934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882876.70750: done with get_vars() 30564 1726882876.70777: done getting variables 30564 1726882876.70835: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:16 -0400 (0:00:00.058) 
0:01:15.290 ****** 30564 1726882876.70879: entering _queue_task() for managed_node2/copy 30564 1726882876.71181: worker is 1 (out of 1 available) 30564 1726882876.71193: exiting _queue_task() for managed_node2/copy 30564 1726882876.71207: done queuing things up, now waiting for results queue to drain 30564 1726882876.71208: waiting for pending results... 30564 1726882876.71521: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882876.71690: in run() - task 0e448fcc-3ce9-4216-acec-000000001849 30564 1726882876.71709: variable 'ansible_search_path' from source: unknown 30564 1726882876.71718: variable 'ansible_search_path' from source: unknown 30564 1726882876.71763: calling self._execute() 30564 1726882876.71875: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882876.71888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882876.71902: variable 'omit' from source: magic vars 30564 1726882876.72295: variable 'ansible_distribution_major_version' from source: facts 30564 1726882876.72318: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882876.72443: variable 'network_provider' from source: set_fact 30564 1726882876.72456: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882876.72467: when evaluation is False, skipping this task 30564 1726882876.72476: _execute() done 30564 1726882876.72488: dumping result to json 30564 1726882876.72496: done dumping result, returning 30564 1726882876.72509: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000001849] 30564 1726882876.72524: sending task result for task 0e448fcc-3ce9-4216-acec-000000001849 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882876.72691: no more pending results, returning what we have 30564 1726882876.72695: results queue empty 30564 1726882876.72697: checking for any_errors_fatal 30564 1726882876.72706: done checking for any_errors_fatal 30564 1726882876.72707: checking for max_fail_percentage 30564 1726882876.72709: done checking for max_fail_percentage 30564 1726882876.72710: checking to see if all hosts have failed and the running result is not ok 30564 1726882876.72711: done checking to see if all hosts have failed 30564 1726882876.72712: getting the remaining hosts for this loop 30564 1726882876.72714: done getting the remaining hosts for this loop 30564 1726882876.72718: getting the next task for host managed_node2 30564 1726882876.72728: done getting next task for host managed_node2 30564 1726882876.72732: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882876.72739: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882876.72765: getting variables 30564 1726882876.72767: in VariableManager get_vars() 30564 1726882876.72809: Calling all_inventory to load vars for managed_node2 30564 1726882876.72812: Calling groups_inventory to load vars for managed_node2 30564 1726882876.72815: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882876.72827: Calling all_plugins_play to load vars for managed_node2 30564 1726882876.72830: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882876.72833: Calling groups_plugins_play to load vars for managed_node2 30564 1726882876.73815: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001849 30564 1726882876.73819: WORKER PROCESS EXITING 30564 1726882876.74606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882876.76567: done with get_vars() 30564 1726882876.76591: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:16 -0400 (0:00:00.058) 0:01:15.348 ****** 30564 1726882876.76686: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882876.77009: worker is 1 (out of 1 available) 30564 1726882876.77022: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882876.77035: done queuing things up, now waiting for results queue to drain 30564 1726882876.77036: waiting for pending results... 
30564 1726882876.77345: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882876.77497: in run() - task 0e448fcc-3ce9-4216-acec-00000000184a 30564 1726882876.77522: variable 'ansible_search_path' from source: unknown 30564 1726882876.77530: variable 'ansible_search_path' from source: unknown 30564 1726882876.77572: calling self._execute() 30564 1726882876.77689: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882876.77706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882876.77721: variable 'omit' from source: magic vars 30564 1726882876.78121: variable 'ansible_distribution_major_version' from source: facts 30564 1726882876.78146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882876.78155: variable 'omit' from source: magic vars 30564 1726882876.78227: variable 'omit' from source: magic vars 30564 1726882876.78404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882876.80818: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882876.80895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882876.80936: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882876.80988: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882876.81019: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882876.81114: variable 'network_provider' from source: set_fact 30564 1726882876.81253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882876.81292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882876.81328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882876.81375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882876.81398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882876.81480: variable 'omit' from source: magic vars 30564 1726882876.81601: variable 'omit' from source: magic vars 30564 1726882876.81717: variable 'network_connections' from source: include params 30564 1726882876.81730: variable 'interface' from source: play vars 30564 1726882876.81791: variable 'interface' from source: play vars 30564 1726882876.81948: variable 'omit' from source: magic vars 30564 1726882876.81969: variable '__lsr_ansible_managed' from source: task vars 30564 1726882876.82031: variable '__lsr_ansible_managed' from source: task vars 30564 1726882876.82235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882876.82458: Loaded config def from plugin (lookup/template) 30564 1726882876.82471: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882876.82513: File lookup term: get_ansible_managed.j2 30564 1726882876.82520: variable 
'ansible_search_path' from source: unknown 30564 1726882876.82530: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882876.82546: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882876.82569: variable 'ansible_search_path' from source: unknown 30564 1726882876.93980: variable 'ansible_managed' from source: unknown 30564 1726882876.94197: variable 'omit' from source: magic vars 30564 1726882876.94264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882876.94297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882876.94817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882876.95166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882876.95181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882876.95214: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882876.95222: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882876.95229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882876.95327: Set connection var ansible_timeout to 10 30564 1726882876.95353: Set connection var ansible_pipelining to False 30564 1726882876.95365: Set connection var ansible_shell_type to sh 30564 1726882876.95375: Set connection var ansible_shell_executable to /bin/sh 30564 1726882876.95385: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882876.95390: Set connection var ansible_connection to ssh 30564 1726882876.95417: variable 'ansible_shell_executable' from source: unknown 30564 1726882876.95443: variable 'ansible_connection' from source: unknown 30564 1726882876.95453: variable 'ansible_module_compression' from source: unknown 30564 1726882876.95460: variable 'ansible_shell_type' from source: unknown 30564 1726882876.95476: variable 'ansible_shell_executable' from source: unknown 30564 1726882876.95486: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882876.95495: variable 'ansible_pipelining' from source: unknown 30564 1726882876.95501: variable 'ansible_timeout' from source: unknown 30564 1726882876.95509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882876.95843: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882876.95874: variable 'omit' from 
source: magic vars 30564 1726882876.95887: starting attempt loop 30564 1726882876.95895: running the handler 30564 1726882876.95914: _low_level_execute_command(): starting 30564 1726882876.95925: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882876.97322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882876.97336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.97349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.97369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.97416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.97427: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882876.97440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.97456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882876.97469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882876.97482: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882876.97499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882876.97514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882876.97530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882876.97544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882876.97555: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882876.97572: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882876.97651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882876.97676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882876.97693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882876.97828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882876.99486: stdout chunk (state=3): >>>/root <<< 30564 1726882876.99656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882876.99693: stdout chunk (state=3): >>><<< 30564 1726882876.99697: stderr chunk (state=3): >>><<< 30564 1726882876.99772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882876.99776: 
_low_level_execute_command(): starting 30564 1726882876.99779: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427 `" && echo ansible-tmp-1726882876.9973521-33829-69905922308427="` echo /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427 `" ) && sleep 0' 30564 1726882877.00425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882877.00437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.00449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.00497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.00546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.00556: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882877.00569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.00585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882877.00594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882877.00606: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882877.00620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.00632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.00650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.00660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
30564 1726882877.00673: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882877.00696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.00782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882877.00801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882877.00814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882877.00945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882877.02830: stdout chunk (state=3): >>>ansible-tmp-1726882876.9973521-33829-69905922308427=/root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427 <<< 30564 1726882877.02977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882877.03154: stderr chunk (state=3): >>><<< 30564 1726882877.03158: stdout chunk (state=3): >>><<< 30564 1726882877.03235: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882876.9973521-33829-69905922308427=/root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882877.03243: variable 'ansible_module_compression' from source: unknown 30564 1726882877.03472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882877.03523: variable 'ansible_facts' from source: unknown 30564 1726882877.03637: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427/AnsiballZ_network_connections.py 30564 1726882877.03985: Sending initial data 30564 1726882877.04010: Sent initial data (167 bytes) 30564 1726882877.06239: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882877.06275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.06318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.06344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.06408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.06430: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882877.06445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.06465: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882877.06478: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 
1726882877.06497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882877.06528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.06548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.06567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.06581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.06593: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882877.06607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.06708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882877.06731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882877.06748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882877.06895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882877.08658: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882877.08751: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 
1726882877.08853: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpxld34u11 /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427/AnsiballZ_network_connections.py <<< 30564 1726882877.08948: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882877.11500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882877.11685: stderr chunk (state=3): >>><<< 30564 1726882877.11689: stdout chunk (state=3): >>><<< 30564 1726882877.11711: done transferring module to remote 30564 1726882877.11722: _low_level_execute_command(): starting 30564 1726882877.11727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427/ /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427/AnsiballZ_network_connections.py && sleep 0' 30564 1726882877.12418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882877.12428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.12442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.12456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.12502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.12510: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882877.12520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.12534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882877.12592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882877.12596: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 30564 1726882877.12629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.12639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.12663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.12686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.12693: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882877.12703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.12886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882877.12903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882877.12913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882877.13034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882877.14825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882877.14971: stderr chunk (state=3): >>><<< 30564 1726882877.15021: stdout chunk (state=3): >>><<< 30564 1726882877.15107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882877.15114: _low_level_execute_command(): starting 30564 1726882877.15117: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427/AnsiballZ_network_connections.py && sleep 0' 30564 1726882877.16387: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882877.16402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.16418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.16436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.16506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.16516: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882877.16531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.16549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882877.16561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882877.16576: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882877.16595: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.16609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.16626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.16638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.16652: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882877.16668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.16784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882877.16815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882877.16833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882877.16981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882877.43147: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882877.44706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.11.158 closed. <<< 30564 1726882877.44796: stderr chunk (state=3): >>><<< 30564 1726882877.44800: stdout chunk (state=3): >>><<< 30564 1726882877.44947: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882877.44951: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882877.44954: _low_level_execute_command(): starting 30564 1726882877.44956: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882876.9973521-33829-69905922308427/ > /dev/null 2>&1 && sleep 0' 30564 1726882877.45609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882877.45618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.45628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.45642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.45684: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.45691: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882877.45703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.45715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882877.45722: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882877.45729: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882877.45737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882877.45746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882877.45758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882877.45767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882877.45777: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882877.45787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882877.45855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882877.45868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882877.45885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882877.46012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882877.47936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882877.48856: stderr chunk (state=3): >>><<< 30564 1726882877.48860: stdout chunk (state=3): >>><<< 30564 1726882877.48927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882877.48930: handler run complete 30564 1726882877.48933: attempt loop complete, returning result 30564 1726882877.48935: _execute() done 30564 1726882877.48956: dumping result to json 30564 1726882877.48959: done dumping result, returning 30564 1726882877.48988: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-00000000184a] 30564 1726882877.49028: sending task result for task 0e448fcc-3ce9-4216-acec-00000000184a changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, 
"changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98 30564 1726882877.49400: no more pending results, returning what we have 30564 1726882877.49403: results queue empty 30564 1726882877.49405: checking for any_errors_fatal 30564 1726882877.49420: done checking for any_errors_fatal 30564 1726882877.49422: checking for max_fail_percentage 30564 1726882877.49424: done checking for max_fail_percentage 30564 1726882877.49425: checking to see if all hosts have failed and the running result is not ok 30564 1726882877.49426: done checking to see if all hosts have failed 30564 1726882877.49427: getting the remaining hosts for this loop 30564 1726882877.49429: done getting the remaining hosts for this loop 30564 1726882877.49433: getting the next task for host managed_node2 30564 1726882877.49538: done getting next task for host managed_node2 30564 1726882877.49543: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882877.49547: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882877.49608: getting variables 30564 1726882877.49611: in VariableManager get_vars() 30564 1726882877.49693: Calling all_inventory to load vars for managed_node2 30564 1726882877.49696: Calling groups_inventory to load vars for managed_node2 30564 1726882877.49789: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882877.49802: Calling all_plugins_play to load vars for managed_node2 30564 1726882877.49805: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882877.49817: Calling groups_plugins_play to load vars for managed_node2 30564 1726882877.50381: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000184a 30564 1726882877.50385: WORKER PROCESS EXITING 30564 1726882877.51770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882877.54994: done with get_vars() 30564 1726882877.55043: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:17 -0400 (0:00:00.785) 0:01:16.133 ****** 30564 1726882877.55224: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882877.55596: worker is 1 (out of 1 available) 30564 1726882877.55608: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882877.55621: done queuing things up, now waiting for results queue to drain 30564 1726882877.55622: waiting for pending results... 
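The `Configure networking connection profiles` result above logs the exact `module_args` handed to `fedora.linux_system_roles.network_connections`. A role invocation that would produce that invocation looks roughly like the following sketch; the play and task names are assumptions, but the `network_connections` values are taken verbatim from the logged `module_args`:

```yaml
# Sketch only: reconstructed from the module_args logged above.
# The provider 'nm' and the connection spec match the log; hosts/name are assumed.
- name: Create the statebr bridge profile
  hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: statebr
            type: bridge
            persistent_state: present
            ip:
              dhcp4: false
              auto6: false
```

The `changed: true` result and the stderr line `add connection statebr, 891d4ab6-…` indicate the profile did not exist yet and was created by NetworkManager.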
30564 1726882877.55947: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882877.56110: in run() - task 0e448fcc-3ce9-4216-acec-00000000184b 30564 1726882877.56134: variable 'ansible_search_path' from source: unknown 30564 1726882877.56142: variable 'ansible_search_path' from source: unknown 30564 1726882877.56192: calling self._execute() 30564 1726882877.56541: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.56554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.56571: variable 'omit' from source: magic vars 30564 1726882877.57049: variable 'ansible_distribution_major_version' from source: facts 30564 1726882877.57076: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882877.57232: variable 'network_state' from source: role '' defaults 30564 1726882877.57248: Evaluated conditional (network_state != {}): False 30564 1726882877.57257: when evaluation is False, skipping this task 30564 1726882877.57267: _execute() done 30564 1726882877.57275: dumping result to json 30564 1726882877.57283: done dumping result, returning 30564 1726882877.57298: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-00000000184b] 30564 1726882877.57310: sending task result for task 0e448fcc-3ce9-4216-acec-00000000184b skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882877.57476: no more pending results, returning what we have 30564 1726882877.57482: results queue empty 30564 1726882877.57484: checking for any_errors_fatal 30564 1726882877.57495: done checking for any_errors_fatal 30564 1726882877.57496: checking for max_fail_percentage 30564 1726882877.57498: done checking for max_fail_percentage 30564 1726882877.57499: 
checking to see if all hosts have failed and the running result is not ok 30564 1726882877.57500: done checking to see if all hosts have failed 30564 1726882877.57501: getting the remaining hosts for this loop 30564 1726882877.57502: done getting the remaining hosts for this loop 30564 1726882877.57507: getting the next task for host managed_node2 30564 1726882877.57538: done getting next task for host managed_node2 30564 1726882877.57543: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882877.57549: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882877.57598: getting variables 30564 1726882877.57601: in VariableManager get_vars() 30564 1726882877.57737: Calling all_inventory to load vars for managed_node2 30564 1726882877.57740: Calling groups_inventory to load vars for managed_node2 30564 1726882877.57744: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882877.57756: Calling all_plugins_play to load vars for managed_node2 30564 1726882877.57760: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882877.57788: Calling groups_plugins_play to load vars for managed_node2 30564 1726882877.58904: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000184b 30564 1726882877.58915: WORKER PROCESS EXITING 30564 1726882877.60668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882877.64540: done with get_vars() 30564 1726882877.65275: done getting variables 30564 1726882877.65333: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:17 -0400 (0:00:00.101) 0:01:16.235 ****** 30564 1726882877.65370: entering _queue_task() for managed_node2/debug 30564 1726882877.65692: worker is 1 (out of 1 available) 30564 1726882877.65705: exiting _queue_task() for managed_node2/debug 30564 1726882877.65717: done queuing things up, now waiting for results queue to drain 30564 1726882877.65718: waiting for pending results... 
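The `Configure networking state` skip above comes from a guard on the `network_state` role variable: the log shows `Evaluated conditional (network_state != {}): False`, so the task never runs. Inside the role, the gate is equivalent to a `when:` conditional of this shape (a sketch, not the role's literal source at main.yml:171):

```yaml
# Sketch of the conditionals evaluated in the log:
#   (ansible_distribution_major_version != '6'): True
#   (network_state != {}): False  -> task skipped
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # state definition elided; this run supplied no network_state
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```

Since this playbook only set `network_connections`, `network_state` stayed at its empty role default and every `network_state` task in the run is skipped the same way.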
30564 1726882877.66776: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882877.67040: in run() - task 0e448fcc-3ce9-4216-acec-00000000184c 30564 1726882877.67056: variable 'ansible_search_path' from source: unknown 30564 1726882877.67061: variable 'ansible_search_path' from source: unknown 30564 1726882877.67219: calling self._execute() 30564 1726882877.67438: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.67442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.67455: variable 'omit' from source: magic vars 30564 1726882877.69297: variable 'ansible_distribution_major_version' from source: facts 30564 1726882877.69311: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882877.69317: variable 'omit' from source: magic vars 30564 1726882877.69389: variable 'omit' from source: magic vars 30564 1726882877.69422: variable 'omit' from source: magic vars 30564 1726882877.69464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882877.69507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882877.69526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882877.69545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882877.69559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882877.69595: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882877.69599: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.69602: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882877.69821: Set connection var ansible_timeout to 10 30564 1726882877.69827: Set connection var ansible_pipelining to False 30564 1726882877.69830: Set connection var ansible_shell_type to sh 30564 1726882877.69835: Set connection var ansible_shell_executable to /bin/sh 30564 1726882877.69845: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882877.69847: Set connection var ansible_connection to ssh 30564 1726882877.69874: variable 'ansible_shell_executable' from source: unknown 30564 1726882877.69877: variable 'ansible_connection' from source: unknown 30564 1726882877.69881: variable 'ansible_module_compression' from source: unknown 30564 1726882877.69883: variable 'ansible_shell_type' from source: unknown 30564 1726882877.69886: variable 'ansible_shell_executable' from source: unknown 30564 1726882877.69889: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.69891: variable 'ansible_pipelining' from source: unknown 30564 1726882877.69893: variable 'ansible_timeout' from source: unknown 30564 1726882877.69898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.70206: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882877.70217: variable 'omit' from source: magic vars 30564 1726882877.70222: starting attempt loop 30564 1726882877.70225: running the handler 30564 1726882877.70549: variable '__network_connections_result' from source: set_fact 30564 1726882877.70694: handler run complete 30564 1726882877.70712: attempt loop complete, returning result 30564 1726882877.70716: _execute() done 30564 1726882877.70720: dumping result to json 30564 1726882877.70723: 
done dumping result, returning 30564 1726882877.70730: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000184c] 30564 1726882877.70736: sending task result for task 0e448fcc-3ce9-4216-acec-00000000184c 30564 1726882877.70839: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000184c 30564 1726882877.70843: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98" ] } 30564 1726882877.70920: no more pending results, returning what we have 30564 1726882877.70923: results queue empty 30564 1726882877.70924: checking for any_errors_fatal 30564 1726882877.70933: done checking for any_errors_fatal 30564 1726882877.70934: checking for max_fail_percentage 30564 1726882877.70936: done checking for max_fail_percentage 30564 1726882877.70937: checking to see if all hosts have failed and the running result is not ok 30564 1726882877.70937: done checking to see if all hosts have failed 30564 1726882877.70938: getting the remaining hosts for this loop 30564 1726882877.70940: done getting the remaining hosts for this loop 30564 1726882877.70944: getting the next task for host managed_node2 30564 1726882877.70952: done getting next task for host managed_node2 30564 1726882877.70956: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882877.70961: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882877.70976: getting variables 30564 1726882877.70978: in VariableManager get_vars() 30564 1726882877.71016: Calling all_inventory to load vars for managed_node2 30564 1726882877.71019: Calling groups_inventory to load vars for managed_node2 30564 1726882877.71022: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882877.71034: Calling all_plugins_play to load vars for managed_node2 30564 1726882877.71038: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882877.71041: Calling groups_plugins_play to load vars for managed_node2 30564 1726882877.73524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882877.77599: done with get_vars() 30564 1726882877.77626: done getting variables 30564 1726882877.78299: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:17 -0400 (0:00:00.129) 0:01:16.364 ****** 30564 1726882877.78342: entering _queue_task() for managed_node2/debug 30564 1726882877.78675: worker is 1 (out of 1 available) 30564 1726882877.78687: exiting _queue_task() for managed_node2/debug 30564 1726882877.78700: done queuing things up, now waiting for results queue to drain 30564 1726882877.78701: waiting for pending results... 30564 1726882877.79597: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882877.79836: in run() - task 0e448fcc-3ce9-4216-acec-00000000184d 30564 1726882877.79974: variable 'ansible_search_path' from source: unknown 30564 1726882877.79978: variable 'ansible_search_path' from source: unknown 30564 1726882877.80010: calling self._execute() 30564 1726882877.80226: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.80232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.80242: variable 'omit' from source: magic vars 30564 1726882877.81180: variable 'ansible_distribution_major_version' from source: facts 30564 1726882877.81194: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882877.81201: variable 'omit' from source: magic vars 30564 1726882877.81389: variable 'omit' from source: magic vars 30564 1726882877.81422: variable 'omit' from source: magic vars 30564 1726882877.81465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882877.81623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882877.81644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882877.81662: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882877.81676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882877.82639: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882877.82642: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.82645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.82871: Set connection var ansible_timeout to 10 30564 1726882877.82874: Set connection var ansible_pipelining to False 30564 1726882877.82877: Set connection var ansible_shell_type to sh 30564 1726882877.82881: Set connection var ansible_shell_executable to /bin/sh 30564 1726882877.82889: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882877.82892: Set connection var ansible_connection to ssh 30564 1726882877.82918: variable 'ansible_shell_executable' from source: unknown 30564 1726882877.82921: variable 'ansible_connection' from source: unknown 30564 1726882877.82924: variable 'ansible_module_compression' from source: unknown 30564 1726882877.82927: variable 'ansible_shell_type' from source: unknown 30564 1726882877.82929: variable 'ansible_shell_executable' from source: unknown 30564 1726882877.83253: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.83261: variable 'ansible_pipelining' from source: unknown 30564 1726882877.83266: variable 'ansible_timeout' from source: unknown 30564 1726882877.83271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.83418: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882877.83428: variable 'omit' from source: magic vars 30564 1726882877.83433: starting attempt loop 30564 1726882877.83437: running the handler 30564 1726882877.83495: variable '__network_connections_result' from source: set_fact 30564 1726882877.83572: variable '__network_connections_result' from source: set_fact 30564 1726882877.83694: handler run complete 30564 1726882877.83838: attempt loop complete, returning result 30564 1726882877.83841: _execute() done 30564 1726882877.83844: dumping result to json 30564 1726882877.83846: done dumping result, returning 30564 1726882877.83856: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000184d] 30564 1726882877.83862: sending task result for task 0e448fcc-3ce9-4216-acec-00000000184d 30564 1726882877.83980: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000184d 30564 1726882877.83984: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98" ] } } 30564 1726882877.84081: no more pending results, returning what we have 30564 1726882877.84084: results queue 
empty 30564 1726882877.84085: checking for any_errors_fatal 30564 1726882877.84092: done checking for any_errors_fatal 30564 1726882877.84093: checking for max_fail_percentage 30564 1726882877.84094: done checking for max_fail_percentage 30564 1726882877.84095: checking to see if all hosts have failed and the running result is not ok 30564 1726882877.84096: done checking to see if all hosts have failed 30564 1726882877.84097: getting the remaining hosts for this loop 30564 1726882877.84098: done getting the remaining hosts for this loop 30564 1726882877.84102: getting the next task for host managed_node2 30564 1726882877.84110: done getting next task for host managed_node2 30564 1726882877.84115: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882877.84119: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882877.84130: getting variables 30564 1726882877.84132: in VariableManager get_vars() 30564 1726882877.84189: Calling all_inventory to load vars for managed_node2 30564 1726882877.84192: Calling groups_inventory to load vars for managed_node2 30564 1726882877.84195: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882877.84206: Calling all_plugins_play to load vars for managed_node2 30564 1726882877.84209: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882877.84213: Calling groups_plugins_play to load vars for managed_node2 30564 1726882877.86396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882877.89897: done with get_vars() 30564 1726882877.89928: done getting variables 30564 1726882877.89998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:17 -0400 (0:00:00.116) 0:01:16.481 ****** 30564 1726882877.90036: entering _queue_task() for managed_node2/debug 30564 1726882877.90413: worker is 1 (out of 1 available) 30564 1726882877.90426: exiting _queue_task() for managed_node2/debug 30564 1726882877.90439: done queuing things up, now waiting for results queue to drain 30564 1726882877.90440: waiting for pending results... 
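The two `ok:` results above (`Show stderr messages for the network_connections` and `Show debug messages for the network_connections`) are plain `debug` tasks printing the registered module result. Judging from the logged task names and the variables they emit, their shape is approximately the following (a sketch; the role's actual tasks at main.yml:177 and main.yml:181 may differ in detail):

```yaml
# Sketch of the debug tasks whose output appears above.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```

This is why the same `[002] #0, state:None persistent_state:present, 'statebr': add connection …` line appears twice in the log: once as raw module STDERR and once echoed through `debug`.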
30564 1726882877.90757: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882877.90912: in run() - task 0e448fcc-3ce9-4216-acec-00000000184e 30564 1726882877.90934: variable 'ansible_search_path' from source: unknown 30564 1726882877.90942: variable 'ansible_search_path' from source: unknown 30564 1726882877.90987: calling self._execute() 30564 1726882877.91094: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.91111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.91126: variable 'omit' from source: magic vars 30564 1726882877.91521: variable 'ansible_distribution_major_version' from source: facts 30564 1726882877.91546: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882877.91681: variable 'network_state' from source: role '' defaults 30564 1726882877.91696: Evaluated conditional (network_state != {}): False 30564 1726882877.91704: when evaluation is False, skipping this task 30564 1726882877.91711: _execute() done 30564 1726882877.91760: dumping result to json 30564 1726882877.91775: done dumping result, returning 30564 1726882877.91787: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-00000000184e] 30564 1726882877.91799: sending task result for task 0e448fcc-3ce9-4216-acec-00000000184e skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882877.91952: no more pending results, returning what we have 30564 1726882877.91957: results queue empty 30564 1726882877.91958: checking for any_errors_fatal 30564 1726882877.91974: done checking for any_errors_fatal 30564 1726882877.91975: checking for max_fail_percentage 30564 1726882877.91977: done checking for max_fail_percentage 30564 1726882877.91978: checking to see if all hosts have 
failed and the running result is not ok 30564 1726882877.91979: done checking to see if all hosts have failed 30564 1726882877.91980: getting the remaining hosts for this loop 30564 1726882877.91982: done getting the remaining hosts for this loop 30564 1726882877.91986: getting the next task for host managed_node2 30564 1726882877.91996: done getting next task for host managed_node2 30564 1726882877.92000: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882877.92007: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882877.92032: getting variables 30564 1726882877.92034: in VariableManager get_vars() 30564 1726882877.92082: Calling all_inventory to load vars for managed_node2 30564 1726882877.92085: Calling groups_inventory to load vars for managed_node2 30564 1726882877.92088: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882877.92101: Calling all_plugins_play to load vars for managed_node2 30564 1726882877.92104: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882877.92108: Calling groups_plugins_play to load vars for managed_node2 30564 1726882877.93749: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000184e 30564 1726882877.93753: WORKER PROCESS EXITING 30564 1726882877.94972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882877.96672: done with get_vars() 30564 1726882877.96696: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:17 -0400 (0:00:00.067) 0:01:16.549 ****** 30564 1726882877.96796: entering _queue_task() for managed_node2/ping 30564 1726882877.97229: worker is 1 (out of 1 available) 30564 1726882877.97241: exiting _queue_task() for managed_node2/ping 30564 1726882877.97253: done queuing things up, now waiting for results queue to drain 30564 1726882877.97254: waiting for pending results... 
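The debug task above was skipped because its `when: network_state != {}` conditional evaluated to False against the role default of an empty dict. A minimal sketch of that skip path, producing a result shaped like the log's `skipping: [managed_node2] => {"false_condition": ...}` output (illustrative only; Ansible templates conditionals through Jinja2, not Python `eval`):

```python
def evaluate_task(conditionals, variables):
    """Return a skip result naming the first false conditional,
    mimicking the 'false_condition' field seen in the log."""
    for cond in conditionals:
        # Ansible renders conditionals with Jinja2; eval() stands in here.
        if not eval(cond, {}, dict(variables)):
            return {"skipped": True, "false_condition": cond}
    return {"skipped": False}

result = evaluate_task(
    ["ansible_distribution_major_version != '6'", "network_state != {}"],
    {"ansible_distribution_major_version": "9", "network_state": {}},
)
```

The distribution-version check passes (as in the log's `Evaluated conditional ... True`), so the second, failing conditional is the one reported back.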
30564 1726882877.98537: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882877.98699: in run() - task 0e448fcc-3ce9-4216-acec-00000000184f 30564 1726882877.98717: variable 'ansible_search_path' from source: unknown 30564 1726882877.98724: variable 'ansible_search_path' from source: unknown 30564 1726882877.98782: calling self._execute() 30564 1726882877.98985: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882877.98997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882877.99013: variable 'omit' from source: magic vars 30564 1726882877.99409: variable 'ansible_distribution_major_version' from source: facts 30564 1726882877.99453: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882877.99467: variable 'omit' from source: magic vars 30564 1726882877.99608: variable 'omit' from source: magic vars 30564 1726882877.99644: variable 'omit' from source: magic vars 30564 1726882877.99805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882877.99844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882877.99986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882878.00008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882878.00024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882878.00058: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882878.00069: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.00078: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882878.00184: Set connection var ansible_timeout to 10 30564 1726882878.00315: Set connection var ansible_pipelining to False 30564 1726882878.00322: Set connection var ansible_shell_type to sh 30564 1726882878.00332: Set connection var ansible_shell_executable to /bin/sh 30564 1726882878.00344: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882878.00350: Set connection var ansible_connection to ssh 30564 1726882878.00381: variable 'ansible_shell_executable' from source: unknown 30564 1726882878.00420: variable 'ansible_connection' from source: unknown 30564 1726882878.00530: variable 'ansible_module_compression' from source: unknown 30564 1726882878.00538: variable 'ansible_shell_type' from source: unknown 30564 1726882878.00545: variable 'ansible_shell_executable' from source: unknown 30564 1726882878.00552: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.00560: variable 'ansible_pipelining' from source: unknown 30564 1726882878.00569: variable 'ansible_timeout' from source: unknown 30564 1726882878.00578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882878.00987: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882878.01004: variable 'omit' from source: magic vars 30564 1726882878.01041: starting attempt loop 30564 1726882878.01064: running the handler 30564 1726882878.01088: _low_level_execute_command(): starting 30564 1726882878.01100: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882878.01842: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882878.01861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882878.01884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.01904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.01949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.01968: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882878.01984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.02007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882878.02021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882878.02033: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882878.02046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.02061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.02082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.02096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.02109: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882878.02122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.02202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882878.02225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882878.02240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882878.02380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882878.04042: stdout chunk (state=3): >>>/root <<< 30564 1726882878.04234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882878.04237: stdout chunk (state=3): >>><<< 30564 1726882878.04239: stderr chunk (state=3): >>><<< 30564 1726882878.04360: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882878.04365: _low_level_execute_command(): starting 30564 1726882878.04368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824 `" && echo ansible-tmp-1726882878.0425646-33882-197289462118824="` echo /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824 `" ) && sleep 0' 30564 1726882878.04916: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882878.04925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.04936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.04949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.04991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.04998: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882878.05010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.05020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882878.05028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882878.05035: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882878.05042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.05052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.05066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.05078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.05085: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882878.05095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.05166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882878.05183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882878.05194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882878.05320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882878.07205: stdout chunk (state=3): >>>ansible-tmp-1726882878.0425646-33882-197289462118824=/root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824 <<< 30564 1726882878.07332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882878.07403: stderr chunk (state=3): >>><<< 30564 1726882878.07406: stdout chunk (state=3): >>><<< 30564 1726882878.07426: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882878.0425646-33882-197289462118824=/root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882878.07486: variable 'ansible_module_compression' from source: unknown 30564 1726882878.07537: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882878.07587: variable 'ansible_facts' from source: unknown 30564 1726882878.07670: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824/AnsiballZ_ping.py 30564 1726882878.07871: Sending initial data 30564 1726882878.07876: Sent initial data (153 bytes) 30564 1726882878.09128: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882878.09136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.09152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.09173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.09229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.09232: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882878.09252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.09276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882878.09296: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882878.09318: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882878.09339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.09361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.09391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.09409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
30564 1726882878.09429: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882878.09449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.09557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882878.09578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882878.09603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882878.09758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882878.11479: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882878.11583: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882878.11688: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpasc6niwz /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824/AnsiballZ_ping.py <<< 30564 1726882878.11786: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882878.13184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882878.13371: stderr chunk (state=3): >>><<< 30564 1726882878.13374: stdout chunk (state=3): >>><<< 30564 1726882878.13485: done transferring 
module to remote 30564 1726882878.13499: _low_level_execute_command(): starting 30564 1726882878.13505: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824/ /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824/AnsiballZ_ping.py && sleep 0' 30564 1726882878.14140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882878.14155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.14176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.14200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.14253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.14267: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882878.14290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.14318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882878.14332: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882878.14344: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882878.14367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.14396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.14417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.14430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.14441: stderr chunk (state=3): >>>debug2: match found <<< 
30564 1726882878.14459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.14537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882878.14553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882878.14571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882878.14715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882878.16511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882878.16514: stdout chunk (state=3): >>><<< 30564 1726882878.16516: stderr chunk (state=3): >>><<< 30564 1726882878.16612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882878.16617: 
_low_level_execute_command(): starting 30564 1726882878.16622: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824/AnsiballZ_ping.py && sleep 0' 30564 1726882878.17208: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882878.17220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.17233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.17249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.17295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.17308: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882878.17323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.17341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882878.17354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882878.17371: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882878.17385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.17400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.17420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.17434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.17445: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882878.17459: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.17536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882878.17565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882878.17571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882878.17719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882878.30520: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882878.31582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882878.31586: stdout chunk (state=3): >>><<< 30564 1726882878.31592: stderr chunk (state=3): >>><<< 30564 1726882878.31612: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882878.31635: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882878.31643: _low_level_execute_command(): starting 30564 1726882878.31647: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882878.0425646-33882-197289462118824/ > /dev/null 2>&1 && sleep 0' 30564 1726882878.32294: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882878.32304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.32314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.32329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.32372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.32378: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882878.32389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.32403: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 30564 1726882878.32411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882878.32420: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882878.32426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882878.32436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882878.32447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882878.32455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882878.32462: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882878.32474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882878.32542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882878.32557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882878.32570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882878.32697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882878.34578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882878.34581: stdout chunk (state=3): >>><<< 30564 1726882878.34583: stderr chunk (state=3): >>><<< 30564 1726882878.34975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882878.34979: handler run complete 30564 1726882878.34982: attempt loop complete, returning result 30564 1726882878.34984: _execute() done 30564 1726882878.34986: dumping result to json 30564 1726882878.34988: done dumping result, returning 30564 1726882878.34990: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-00000000184f] 30564 1726882878.34993: sending task result for task 0e448fcc-3ce9-4216-acec-00000000184f 30564 1726882878.35062: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000184f 30564 1726882878.35071: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882878.35138: no more pending results, returning what we have 30564 1726882878.35142: results queue empty 30564 1726882878.35143: checking for any_errors_fatal 30564 1726882878.35149: done checking for any_errors_fatal 30564 1726882878.35150: checking for max_fail_percentage 30564 1726882878.35151: done checking for max_fail_percentage 30564 1726882878.35152: checking to see if all hosts have failed and the running result is not ok 30564 1726882878.35153: done checking to see if all hosts 
have failed 30564 1726882878.35154: getting the remaining hosts for this loop 30564 1726882878.35155: done getting the remaining hosts for this loop 30564 1726882878.35158: getting the next task for host managed_node2 30564 1726882878.35175: done getting next task for host managed_node2 30564 1726882878.35178: ^ task is: TASK: meta (role_complete) 30564 1726882878.35183: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882878.35198: getting variables 30564 1726882878.35199: in VariableManager get_vars() 30564 1726882878.35241: Calling all_inventory to load vars for managed_node2 30564 1726882878.35244: Calling groups_inventory to load vars for managed_node2 30564 1726882878.35246: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.35257: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.35260: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.35263: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.36712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882878.38060: done with get_vars() 30564 1726882878.38085: done getting variables 30564 1726882878.38143: done queuing things up, now waiting for results queue to drain 30564 1726882878.38144: results queue empty 30564 1726882878.38145: checking for any_errors_fatal 30564 1726882878.38146: done checking for any_errors_fatal 30564 1726882878.38147: checking for max_fail_percentage 30564 1726882878.38148: done checking for max_fail_percentage 30564 1726882878.38148: checking to see if all hosts have failed and the running result is not ok 30564 1726882878.38149: done checking to see if all hosts have failed 30564 1726882878.38149: getting the remaining hosts for this loop 30564 1726882878.38150: done getting the remaining hosts for this loop 30564 1726882878.38152: getting the next task for host managed_node2 30564 1726882878.38155: done getting next task for host managed_node2 30564 1726882878.38156: ^ task is: TASK: Show result 30564 1726882878.38158: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882878.38159: getting variables 30564 1726882878.38160: in VariableManager get_vars() 30564 1726882878.38172: Calling all_inventory to load vars for managed_node2 30564 1726882878.38173: Calling groups_inventory to load vars for managed_node2 30564 1726882878.38175: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.38179: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.38181: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.38182: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.38956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882878.40131: done with get_vars() 30564 1726882878.40150: done getting variables 30564 1726882878.40191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:41:18 -0400 (0:00:00.434) 0:01:16.983 ****** 30564 1726882878.40218: entering _queue_task() for managed_node2/debug 30564 1726882878.40526: worker is 1 (out of 1 available) 30564 1726882878.40537: exiting _queue_task() for managed_node2/debug 30564 1726882878.40548: done queuing things up, now waiting for results queue to drain 30564 1726882878.40549: waiting for pending results... 30564 1726882878.40812: running TaskExecutor() for managed_node2/TASK: Show result 30564 1726882878.40888: in run() - task 0e448fcc-3ce9-4216-acec-0000000017d1 30564 1726882878.40901: variable 'ansible_search_path' from source: unknown 30564 1726882878.40905: variable 'ansible_search_path' from source: unknown 30564 1726882878.40933: calling self._execute() 30564 1726882878.41014: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.41018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882878.41026: variable 'omit' from source: magic vars 30564 1726882878.41311: variable 'ansible_distribution_major_version' from source: facts 30564 1726882878.41321: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882878.41329: variable 'omit' from source: magic vars 30564 1726882878.41360: variable 'omit' from source: magic vars 30564 1726882878.41387: variable 'omit' from source: magic vars 30564 1726882878.41421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882878.41448: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882878.41466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882878.41483: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882878.41493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882878.41516: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882878.41519: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.41521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882878.41595: Set connection var ansible_timeout to 10 30564 1726882878.41598: Set connection var ansible_pipelining to False 30564 1726882878.41601: Set connection var ansible_shell_type to sh 30564 1726882878.41606: Set connection var ansible_shell_executable to /bin/sh 30564 1726882878.41613: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882878.41615: Set connection var ansible_connection to ssh 30564 1726882878.41633: variable 'ansible_shell_executable' from source: unknown 30564 1726882878.41636: variable 'ansible_connection' from source: unknown 30564 1726882878.41640: variable 'ansible_module_compression' from source: unknown 30564 1726882878.41642: variable 'ansible_shell_type' from source: unknown 30564 1726882878.41645: variable 'ansible_shell_executable' from source: unknown 30564 1726882878.41647: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.41650: variable 'ansible_pipelining' from source: unknown 30564 1726882878.41652: variable 'ansible_timeout' from source: unknown 30564 1726882878.41654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882878.41754: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882878.41765: variable 'omit' from source: magic vars 30564 1726882878.41776: starting attempt loop 30564 1726882878.41780: running the handler 30564 1726882878.41815: variable '__network_connections_result' from source: set_fact 30564 1726882878.41869: variable '__network_connections_result' from source: set_fact 30564 1726882878.41955: handler run complete 30564 1726882878.41979: attempt loop complete, returning result 30564 1726882878.41982: _execute() done 30564 1726882878.41986: dumping result to json 30564 1726882878.41988: done dumping result, returning 30564 1726882878.41994: done running TaskExecutor() for managed_node2/TASK: Show result [0e448fcc-3ce9-4216-acec-0000000017d1] 30564 1726882878.42000: sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d1 30564 1726882878.42092: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d1 30564 1726882878.42095: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98" ] } } 30564 1726882878.42180: no more pending results, returning what we have 30564 1726882878.42183: results queue empty 30564 1726882878.42184: checking for any_errors_fatal 30564 
1726882878.42186: done checking for any_errors_fatal 30564 1726882878.42187: checking for max_fail_percentage 30564 1726882878.42189: done checking for max_fail_percentage 30564 1726882878.42190: checking to see if all hosts have failed and the running result is not ok 30564 1726882878.42190: done checking to see if all hosts have failed 30564 1726882878.42191: getting the remaining hosts for this loop 30564 1726882878.42193: done getting the remaining hosts for this loop 30564 1726882878.42197: getting the next task for host managed_node2 30564 1726882878.42208: done getting next task for host managed_node2 30564 1726882878.42211: ^ task is: TASK: Include network role 30564 1726882878.42214: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882878.42218: getting variables 30564 1726882878.42219: in VariableManager get_vars() 30564 1726882878.42248: Calling all_inventory to load vars for managed_node2 30564 1726882878.42251: Calling groups_inventory to load vars for managed_node2 30564 1726882878.42254: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.42265: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.42268: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.42271: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.43195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882878.44111: done with get_vars() 30564 1726882878.44126: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:41:18 -0400 (0:00:00.039) 0:01:17.023 ****** 30564 1726882878.44192: entering _queue_task() for managed_node2/include_role 30564 1726882878.44391: worker is 1 (out of 1 available) 30564 1726882878.44404: exiting _queue_task() for managed_node2/include_role 30564 1726882878.44416: done queuing things up, now waiting for results queue to drain 30564 1726882878.44417: waiting for pending results... 
30564 1726882878.44598: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882878.44670: in run() - task 0e448fcc-3ce9-4216-acec-0000000017d5 30564 1726882878.44686: variable 'ansible_search_path' from source: unknown 30564 1726882878.44690: variable 'ansible_search_path' from source: unknown 30564 1726882878.44716: calling self._execute() 30564 1726882878.44793: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.44796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882878.44806: variable 'omit' from source: magic vars 30564 1726882878.45090: variable 'ansible_distribution_major_version' from source: facts 30564 1726882878.45102: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882878.45107: _execute() done 30564 1726882878.45110: dumping result to json 30564 1726882878.45113: done dumping result, returning 30564 1726882878.45119: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-0000000017d5] 30564 1726882878.45124: sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d5 30564 1726882878.45228: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d5 30564 1726882878.45231: WORKER PROCESS EXITING 30564 1726882878.45262: no more pending results, returning what we have 30564 1726882878.45268: in VariableManager get_vars() 30564 1726882878.45304: Calling all_inventory to load vars for managed_node2 30564 1726882878.45307: Calling groups_inventory to load vars for managed_node2 30564 1726882878.45310: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.45319: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.45322: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.45325: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.47400: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882878.50290: done with get_vars() 30564 1726882878.50313: variable 'ansible_search_path' from source: unknown 30564 1726882878.50315: variable 'ansible_search_path' from source: unknown 30564 1726882878.50471: variable 'omit' from source: magic vars 30564 1726882878.50511: variable 'omit' from source: magic vars 30564 1726882878.50526: variable 'omit' from source: magic vars 30564 1726882878.50530: we have included files to process 30564 1726882878.50531: generating all_blocks data 30564 1726882878.50532: done generating all_blocks data 30564 1726882878.50538: processing included file: fedora.linux_system_roles.network 30564 1726882878.50558: in VariableManager get_vars() 30564 1726882878.50576: done with get_vars() 30564 1726882878.50604: in VariableManager get_vars() 30564 1726882878.50621: done with get_vars() 30564 1726882878.50657: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882878.50786: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882878.50873: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882878.51378: in VariableManager get_vars() 30564 1726882878.51398: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882878.53379: iterating over new_blocks loaded from include file 30564 1726882878.53381: in VariableManager get_vars() 30564 1726882878.53399: done with get_vars() 30564 1726882878.53401: filtering new block on tags 30564 1726882878.59690: done filtering new block on tags 30564 1726882878.59695: in VariableManager get_vars() 30564 1726882878.59713: done with get_vars() 30564 1726882878.59715: filtering new block on tags 30564 1726882878.59867: done 
filtering new block on tags 30564 1726882878.59869: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882878.59899: extending task lists for all hosts with included blocks 30564 1726882878.60020: done extending task lists 30564 1726882878.60022: done processing included files 30564 1726882878.60022: results queue empty 30564 1726882878.60023: checking for any_errors_fatal 30564 1726882878.60027: done checking for any_errors_fatal 30564 1726882878.60027: checking for max_fail_percentage 30564 1726882878.60028: done checking for max_fail_percentage 30564 1726882878.60029: checking to see if all hosts have failed and the running result is not ok 30564 1726882878.60030: done checking to see if all hosts have failed 30564 1726882878.60031: getting the remaining hosts for this loop 30564 1726882878.60032: done getting the remaining hosts for this loop 30564 1726882878.60034: getting the next task for host managed_node2 30564 1726882878.60038: done getting next task for host managed_node2 30564 1726882878.60041: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882878.60044: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882878.60054: getting variables 30564 1726882878.60056: in VariableManager get_vars() 30564 1726882878.60071: Calling all_inventory to load vars for managed_node2 30564 1726882878.60073: Calling groups_inventory to load vars for managed_node2 30564 1726882878.60075: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.60080: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.60083: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.60086: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.62482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882878.65038: done with get_vars() 30564 1726882878.65062: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:18 -0400 (0:00:00.209) 0:01:17.232 ****** 30564 1726882878.65141: entering _queue_task() for managed_node2/include_tasks 30564 1726882878.65470: worker is 1 (out of 1 available) 30564 1726882878.65484: exiting _queue_task() for managed_node2/include_tasks 30564 1726882878.65497: done queuing things up, now waiting for results queue to drain 30564 1726882878.65498: waiting for pending results... 
30564 1726882878.65799: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882878.65939: in run() - task 0e448fcc-3ce9-4216-acec-0000000019bf 30564 1726882878.65961: variable 'ansible_search_path' from source: unknown 30564 1726882878.65970: variable 'ansible_search_path' from source: unknown 30564 1726882878.66007: calling self._execute() 30564 1726882878.66118: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.66131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882878.66145: variable 'omit' from source: magic vars 30564 1726882878.66670: variable 'ansible_distribution_major_version' from source: facts 30564 1726882878.66691: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882878.66703: _execute() done 30564 1726882878.66713: dumping result to json 30564 1726882878.66720: done dumping result, returning 30564 1726882878.66730: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-0000000019bf] 30564 1726882878.66827: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019bf 30564 1726882878.66988: no more pending results, returning what we have 30564 1726882878.66993: in VariableManager get_vars() 30564 1726882878.67046: Calling all_inventory to load vars for managed_node2 30564 1726882878.67049: Calling groups_inventory to load vars for managed_node2 30564 1726882878.67052: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.67067: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.67071: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.67074: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.68256: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019bf 30564 
1726882878.68260: WORKER PROCESS EXITING 30564 1726882878.69587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882878.73209: done with get_vars() 30564 1726882878.73233: variable 'ansible_search_path' from source: unknown 30564 1726882878.73234: variable 'ansible_search_path' from source: unknown 30564 1726882878.73278: we have included files to process 30564 1726882878.73280: generating all_blocks data 30564 1726882878.73281: done generating all_blocks data 30564 1726882878.73285: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882878.73286: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882878.73288: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882878.74126: done processing included file 30564 1726882878.74128: iterating over new_blocks loaded from include file 30564 1726882878.74130: in VariableManager get_vars() 30564 1726882878.74155: done with get_vars() 30564 1726882878.74156: filtering new block on tags 30564 1726882878.74193: done filtering new block on tags 30564 1726882878.74197: in VariableManager get_vars() 30564 1726882878.74222: done with get_vars() 30564 1726882878.74223: filtering new block on tags 30564 1726882878.74272: done filtering new block on tags 30564 1726882878.74275: in VariableManager get_vars() 30564 1726882878.74302: done with get_vars() 30564 1726882878.74304: filtering new block on tags 30564 1726882878.74348: done filtering new block on tags 30564 1726882878.74351: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882878.74356: extending task lists for all hosts 
with included blocks 30564 1726882878.76309: done extending task lists 30564 1726882878.76311: done processing included files 30564 1726882878.76311: results queue empty 30564 1726882878.76312: checking for any_errors_fatal 30564 1726882878.76315: done checking for any_errors_fatal 30564 1726882878.76316: checking for max_fail_percentage 30564 1726882878.76317: done checking for max_fail_percentage 30564 1726882878.76318: checking to see if all hosts have failed and the running result is not ok 30564 1726882878.76318: done checking to see if all hosts have failed 30564 1726882878.76319: getting the remaining hosts for this loop 30564 1726882878.76321: done getting the remaining hosts for this loop 30564 1726882878.76323: getting the next task for host managed_node2 30564 1726882878.76328: done getting next task for host managed_node2 30564 1726882878.76331: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882878.76335: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882878.76346: getting variables 30564 1726882878.76348: in VariableManager get_vars() 30564 1726882878.76362: Calling all_inventory to load vars for managed_node2 30564 1726882878.76366: Calling groups_inventory to load vars for managed_node2 30564 1726882878.76368: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.76373: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.76375: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.76378: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.78315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882878.81493: done with get_vars() 30564 1726882878.81516: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:18 -0400 (0:00:00.164) 0:01:17.397 ****** 30564 1726882878.81599: entering _queue_task() for managed_node2/setup 30564 1726882878.82158: worker is 1 (out of 1 available) 30564 1726882878.82171: exiting _queue_task() for managed_node2/setup 30564 1726882878.82184: done queuing things up, now waiting for results queue to drain 30564 1726882878.82185: waiting for pending results... 
30564 1726882878.83007: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882878.83177: in run() - task 0e448fcc-3ce9-4216-acec-000000001a16 30564 1726882878.83292: variable 'ansible_search_path' from source: unknown 30564 1726882878.83297: variable 'ansible_search_path' from source: unknown 30564 1726882878.83330: calling self._execute() 30564 1726882878.83535: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882878.83540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882878.83549: variable 'omit' from source: magic vars 30564 1726882878.84372: variable 'ansible_distribution_major_version' from source: facts 30564 1726882878.84389: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882878.84850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882878.89713: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882878.89901: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882878.89936: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882878.89972: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882878.90101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882878.90183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882878.90213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882878.90352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882878.90396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882878.90410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882878.90579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882878.90605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882878.90629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882878.90789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882878.90803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882878.91202: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882878.91212: variable 'ansible_facts' from source: unknown 30564 1726882878.92820: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882878.92937: when evaluation is False, skipping this task 30564 1726882878.92941: _execute() done 30564 1726882878.92943: dumping result to json 30564 1726882878.92946: done dumping result, returning 30564 1726882878.92954: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000001a16] 30564 1726882878.92959: sending task result for task 0e448fcc-3ce9-4216-acec-000000001a16 30564 1726882878.93062: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001a16 30564 1726882878.93067: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882878.93115: no more pending results, returning what we have 30564 1726882878.93119: results queue empty 30564 1726882878.93121: checking for any_errors_fatal 30564 1726882878.93123: done checking for any_errors_fatal 30564 1726882878.93124: checking for max_fail_percentage 30564 1726882878.93126: done checking for max_fail_percentage 30564 1726882878.93127: checking to see if all hosts have failed and the running result is not ok 30564 1726882878.93128: done checking to see if all hosts have failed 30564 1726882878.93129: getting the remaining hosts for this loop 30564 1726882878.93131: done getting the remaining hosts for this loop 30564 1726882878.93135: getting the next task for host managed_node2 30564 1726882878.93149: done getting next task for host managed_node2 30564 1726882878.93153: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882878.93159: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882878.93185: getting variables 30564 1726882878.93187: in VariableManager get_vars() 30564 1726882878.93227: Calling all_inventory to load vars for managed_node2 30564 1726882878.93230: Calling groups_inventory to load vars for managed_node2 30564 1726882878.93232: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882878.93242: Calling all_plugins_play to load vars for managed_node2 30564 1726882878.93244: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882878.93253: Calling groups_plugins_play to load vars for managed_node2 30564 1726882878.95389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882879.00224: done with get_vars() 30564 1726882879.00272: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:19 -0400 (0:00:00.188) 0:01:17.585 ****** 30564 1726882879.00432: entering _queue_task() for managed_node2/stat 30564 1726882879.01107: worker is 1 (out of 1 available) 30564 1726882879.01120: exiting _queue_task() for managed_node2/stat 30564 1726882879.01134: done queuing things up, now waiting for results queue to drain 30564 1726882879.01136: waiting for pending results... 
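The skip recorded above comes from the role guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. A minimal Python sketch of what that Jinja2 expression computes; the fact names here are hypothetical stand-ins, the real list comes from the role's defaults:

```python
# Hypothetical fact names; the real list lives in the role's defaults file.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Jinja2's difference() keeps items of the left list that are absent
# from the right-hand list.
missing = [f for f in required_facts if f not in ansible_facts.keys()]

# The task only runs (re-gathers facts) when something is missing.
run_task = len(missing) > 0
print(run_task)  # False -> "when evaluation is False, skipping this task"
```

With every required fact already present, the condition is False and the task is skipped, which is exactly what the log shows.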
30564 1726882879.01998: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882879.02398: in run() - task 0e448fcc-3ce9-4216-acec-000000001a18 30564 1726882879.02412: variable 'ansible_search_path' from source: unknown 30564 1726882879.02416: variable 'ansible_search_path' from source: unknown 30564 1726882879.02451: calling self._execute() 30564 1726882879.02665: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882879.02671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882879.02784: variable 'omit' from source: magic vars 30564 1726882879.03642: variable 'ansible_distribution_major_version' from source: facts 30564 1726882879.03656: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882879.04079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882879.04647: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882879.04806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882879.04976: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882879.04996: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882879.05229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882879.05254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882879.05286: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882879.05425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882879.05521: variable '__network_is_ostree' from source: set_fact 30564 1726882879.05748: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882879.05751: when evaluation is False, skipping this task 30564 1726882879.05754: _execute() done 30564 1726882879.05759: dumping result to json 30564 1726882879.05761: done dumping result, returning 30564 1726882879.05771: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000001a18] 30564 1726882879.05779: sending task result for task 0e448fcc-3ce9-4216-acec-000000001a18 30564 1726882879.05875: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001a18 30564 1726882879.05879: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882879.05928: no more pending results, returning what we have 30564 1726882879.05932: results queue empty 30564 1726882879.05933: checking for any_errors_fatal 30564 1726882879.05943: done checking for any_errors_fatal 30564 1726882879.05943: checking for max_fail_percentage 30564 1726882879.05945: done checking for max_fail_percentage 30564 1726882879.05946: checking to see if all hosts have failed and the running result is not ok 30564 1726882879.05947: done checking to see if all hosts have failed 30564 1726882879.05948: getting the remaining hosts for this loop 30564 1726882879.05950: done getting the remaining hosts for this loop 30564 
1726882879.05953: getting the next task for host managed_node2 30564 1726882879.05962: done getting next task for host managed_node2 30564 1726882879.05968: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882879.05974: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882879.05998: getting variables 30564 1726882879.06000: in VariableManager get_vars() 30564 1726882879.06039: Calling all_inventory to load vars for managed_node2 30564 1726882879.06041: Calling groups_inventory to load vars for managed_node2 30564 1726882879.06043: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882879.06053: Calling all_plugins_play to load vars for managed_node2 30564 1726882879.06055: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882879.06058: Calling groups_plugins_play to load vars for managed_node2 30564 1726882879.08844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882879.12729: done with get_vars() 30564 1726882879.12757: done getting variables 30564 1726882879.12919: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:19 -0400 (0:00:00.125) 0:01:17.710 ****** 30564 1726882879.12956: entering _queue_task() for managed_node2/set_fact 30564 1726882879.13828: worker is 1 (out of 1 available) 30564 1726882879.13841: exiting _queue_task() for managed_node2/set_fact 30564 1726882879.13854: done queuing things up, now waiting for results queue to drain 30564 1726882879.13972: waiting for pending results... 
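Both ostree-related tasks in this block (the `stat` check above and the `set_fact` task queued here) share the guard `not __network_is_ostree is defined`. A rough sketch of that memoization pattern, assuming the fact dictionary shape shown; the values are illustrative:

```python
# Once any earlier play has set __network_is_ostree, the guard is False
# and both the stat task and the set_fact task are skipped. The stored
# value itself (True/False) does not matter to the guard.
host_facts = {"__network_is_ostree": False}

def should_run(facts):
    # Jinja2's `is defined` test reduces to plain membership here.
    return "__network_is_ostree" not in facts

print(should_run(host_facts))  # False -> skipped, as in the log
print(should_run({}))          # True  -> a first run would execute the check
```

This is why the log shows `false_condition: "not __network_is_ostree is defined"` for both task IDs ...1a18 and ...1a19.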
30564 1726882879.15038: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882879.15429: in run() - task 0e448fcc-3ce9-4216-acec-000000001a19 30564 1726882879.15443: variable 'ansible_search_path' from source: unknown 30564 1726882879.15447: variable 'ansible_search_path' from source: unknown 30564 1726882879.15486: calling self._execute() 30564 1726882879.15701: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882879.15706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882879.15716: variable 'omit' from source: magic vars 30564 1726882879.16559: variable 'ansible_distribution_major_version' from source: facts 30564 1726882879.16578: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882879.16999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882879.17641: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882879.17826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882879.17857: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882879.17895: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882879.18258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882879.18290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882879.18316: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882879.18383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882879.18595: variable '__network_is_ostree' from source: set_fact 30564 1726882879.18604: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882879.18607: when evaluation is False, skipping this task 30564 1726882879.18609: _execute() done 30564 1726882879.18612: dumping result to json 30564 1726882879.18614: done dumping result, returning 30564 1726882879.18620: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000001a19] 30564 1726882879.18626: sending task result for task 0e448fcc-3ce9-4216-acec-000000001a19 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882879.18797: no more pending results, returning what we have 30564 1726882879.18801: results queue empty 30564 1726882879.18803: checking for any_errors_fatal 30564 1726882879.18809: done checking for any_errors_fatal 30564 1726882879.18810: checking for max_fail_percentage 30564 1726882879.18812: done checking for max_fail_percentage 30564 1726882879.18813: checking to see if all hosts have failed and the running result is not ok 30564 1726882879.18814: done checking to see if all hosts have failed 30564 1726882879.18815: getting the remaining hosts for this loop 30564 1726882879.18817: done getting the remaining hosts for this loop 30564 1726882879.18822: getting the next task for host managed_node2 30564 1726882879.18834: done getting next task for host managed_node2 30564 
1726882879.18839: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882879.18845: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882879.18876: getting variables 30564 1726882879.18878: in VariableManager get_vars() 30564 1726882879.18924: Calling all_inventory to load vars for managed_node2 30564 1726882879.18927: Calling groups_inventory to load vars for managed_node2 30564 1726882879.18930: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882879.18941: Calling all_plugins_play to load vars for managed_node2 30564 1726882879.18944: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882879.18947: Calling groups_plugins_play to load vars for managed_node2 30564 1726882879.19942: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001a19 30564 1726882879.19946: WORKER PROCESS EXITING 30564 1726882879.22203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882879.26725: done with get_vars() 30564 1726882879.26750: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:19 -0400 (0:00:00.140) 0:01:17.850 ****** 30564 1726882879.26971: entering _queue_task() for managed_node2/service_facts 30564 1726882879.28005: worker is 1 (out of 1 available) 30564 1726882879.28047: exiting _queue_task() for managed_node2/service_facts 30564 1726882879.28061: done queuing things up, now waiting for results queue to drain 30564 1726882879.28062: waiting for pending results... 
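Unlike the two skipped tasks, `service_facts` actually executes, so the worker drops down to `_low_level_execute_command()`. Its first probe is a trivial shell command to discover the remote home directory; a local sketch of that probe (run through `/bin/sh` just as the log shows, with Ansible's customary `&& sleep 0` suffix appended):

```python
import subprocess

# Sketch of the first low-level probe: `echo ~` prints the login user's
# home directory; Ansible appends `&& sleep 0` to every such command.
result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
)
home = result.stdout.strip()
print(result.returncode)  # 0 on success; the log records rc=0, stdout=/root
```

In the real run this goes over the multiplexed SSH connection (`mux_client_request_session` in the stderr chunks) rather than a local shell.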
30564 1726882879.29834: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882879.30168: in run() - task 0e448fcc-3ce9-4216-acec-000000001a1b 30564 1726882879.30192: variable 'ansible_search_path' from source: unknown 30564 1726882879.30200: variable 'ansible_search_path' from source: unknown 30564 1726882879.30250: calling self._execute() 30564 1726882879.30784: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882879.30796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882879.30812: variable 'omit' from source: magic vars 30564 1726882879.31386: variable 'ansible_distribution_major_version' from source: facts 30564 1726882879.31551: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882879.31567: variable 'omit' from source: magic vars 30564 1726882879.31650: variable 'omit' from source: magic vars 30564 1726882879.31717: variable 'omit' from source: magic vars 30564 1726882879.32146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882879.32188: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882879.32213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882879.32235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882879.32252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882879.32291: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882879.32300: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882879.32310: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882879.32417: Set connection var ansible_timeout to 10 30564 1726882879.32429: Set connection var ansible_pipelining to False 30564 1726882879.32436: Set connection var ansible_shell_type to sh 30564 1726882879.32446: Set connection var ansible_shell_executable to /bin/sh 30564 1726882879.32469: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882879.32478: Set connection var ansible_connection to ssh 30564 1726882879.32510: variable 'ansible_shell_executable' from source: unknown 30564 1726882879.32528: variable 'ansible_connection' from source: unknown 30564 1726882879.32536: variable 'ansible_module_compression' from source: unknown 30564 1726882879.32543: variable 'ansible_shell_type' from source: unknown 30564 1726882879.32550: variable 'ansible_shell_executable' from source: unknown 30564 1726882879.32557: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882879.32568: variable 'ansible_pipelining' from source: unknown 30564 1726882879.32590: variable 'ansible_timeout' from source: unknown 30564 1726882879.32601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882879.33182: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882879.33199: variable 'omit' from source: magic vars 30564 1726882879.33209: starting attempt loop 30564 1726882879.33216: running the handler 30564 1726882879.33233: _low_level_execute_command(): starting 30564 1726882879.33247: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882879.35785: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882879.35789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.35947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882879.35952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.35956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.36085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882879.36331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882879.36338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882879.36452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882879.38113: stdout chunk (state=3): >>>/root <<< 30564 1726882879.38214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882879.38289: stderr chunk (state=3): >>><<< 30564 1726882879.38293: stdout chunk (state=3): >>><<< 30564 1726882879.38404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882879.38408: _low_level_execute_command(): starting 30564 1726882879.38410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477 `" && echo ansible-tmp-1726882879.383114-33957-241685347625477="` echo /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477 `" ) && sleep 0' 30564 1726882879.39727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.39730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.39755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882879.39860: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882879.39889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 30564 1726882879.39930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882879.39944: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882879.39957: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882879.39966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.39975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882879.39987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.40034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882879.40047: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882879.40050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.40199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882879.40238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882879.40255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882879.40391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882879.42571: stdout chunk (state=3): >>>ansible-tmp-1726882879.383114-33957-241685347625477=/root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477 <<< 30564 1726882879.42575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882879.42577: stdout chunk (state=3): >>><<< 30564 1726882879.42580: stderr chunk (state=3): >>><<< 30564 1726882879.42582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882879.383114-33957-241685347625477=/root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882879.42584: variable 'ansible_module_compression' from source: unknown 30564 1726882879.42586: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882879.42588: variable 'ansible_facts' from source: unknown 30564 1726882879.42658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477/AnsiballZ_service_facts.py 30564 1726882879.43499: Sending initial data 30564 1726882879.43503: Sent initial data (161 bytes) 30564 1726882879.46132: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882879.46140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.46151: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882879.46168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.46211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882879.46223: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882879.46238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.46252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882879.46259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882879.46268: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882879.46281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.46291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882879.46304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.46313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882879.46319: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882879.46333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.46413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882879.46561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882879.46581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882879.46709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882879.48524: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882879.48617: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882879.48718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp9_876cs5 /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477/AnsiballZ_service_facts.py <<< 30564 1726882879.48820: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882879.50302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882879.50381: stderr chunk (state=3): >>><<< 30564 1726882879.50385: stdout chunk (state=3): >>><<< 30564 1726882879.50406: done transferring module to remote 30564 1726882879.50418: _low_level_execute_command(): starting 30564 1726882879.50423: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477/ /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477/AnsiballZ_service_facts.py && sleep 0' 30564 1726882879.52279: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882879.52287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.52299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882879.52314: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.52359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882879.52449: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882879.52459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.52478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882879.52486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882879.52493: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882879.52501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.52510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882879.52522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.52530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882879.52537: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882879.52548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.52628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882879.52685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882879.52698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882879.52904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882879.54729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882879.54732: stdout chunk (state=3): 
>>><<< 30564 1726882879.54739: stderr chunk (state=3): >>><<< 30564 1726882879.54752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882879.54755: _low_level_execute_command(): starting 30564 1726882879.54761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477/AnsiballZ_service_facts.py && sleep 0' 30564 1726882879.56552: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882879.56557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.56662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.56669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882879.56683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882879.56708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882879.56722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882879.56728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882879.56811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882879.56933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882879.57068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882880.88870: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": 
"systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", 
"state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": 
"serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882880.89774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882880.89882: stderr chunk (state=3): >>><<< 30564 1726882880.89897: stdout chunk (state=3): >>><<< 30564 1726882880.89938: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882880.92946: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882880.92953: _low_level_execute_command(): starting 30564 1726882880.92959: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882879.383114-33957-241685347625477/ > /dev/null 2>&1 && sleep 0' 30564 1726882880.93558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882880.93600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882880.93622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882880.93732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882880.93750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882880.93895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882880.95711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882880.95756: stderr chunk (state=3): >>><<< 30564 1726882880.95759: stdout chunk (state=3): >>><<< 30564 1726882880.95795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882880.95798: handler run complete 30564 1726882880.95914: variable 'ansible_facts' from source: unknown 30564 1726882880.96022: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882880.96276: variable 'ansible_facts' from source: unknown 30564 1726882880.96354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882880.96462: attempt loop complete, returning result 30564 1726882880.96467: _execute() done 30564 1726882880.96474: dumping result to json 30564 1726882880.96505: done dumping result, returning 30564 1726882880.96515: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000001a1b] 30564 1726882880.96523: sending task result for task 0e448fcc-3ce9-4216-acec-000000001a1b ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882880.97313: no more pending results, returning what we have 30564 1726882880.97317: results queue empty 30564 1726882880.97318: checking for any_errors_fatal 30564 1726882880.97325: done checking for any_errors_fatal 30564 1726882880.97326: checking for max_fail_percentage 30564 1726882880.97328: done checking for max_fail_percentage 30564 1726882880.97329: checking to see if all hosts have failed and the running result is not ok 30564 1726882880.97329: done checking to see if all hosts have failed 30564 1726882880.97330: getting the remaining hosts for this loop 30564 1726882880.97332: done getting the remaining hosts for this loop 30564 1726882880.97336: getting the next task for host managed_node2 30564 1726882880.97346: done getting next task for host managed_node2 30564 1726882880.97349: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882880.97356: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882880.97372: getting variables 30564 1726882880.97374: in VariableManager get_vars() 30564 1726882880.97415: Calling all_inventory to load vars for managed_node2 30564 1726882880.97418: Calling groups_inventory to load vars for managed_node2 30564 1726882880.97420: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882880.97432: Calling all_plugins_play to load vars for managed_node2 30564 1726882880.97434: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882880.97438: Calling groups_plugins_play to load vars for managed_node2 30564 1726882880.99336: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001a1b 30564 1726882880.99339: WORKER PROCESS EXITING 30564 1726882881.01813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882881.05148: done with get_vars() 30564 1726882881.05181: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:21 -0400 (0:00:01.783) 0:01:19.634 ****** 30564 1726882881.05329: entering _queue_task() for managed_node2/package_facts 30564 1726882881.05750: worker is 1 (out of 1 available) 30564 1726882881.05773: exiting _queue_task() for managed_node2/package_facts 30564 1726882881.05790: done queuing things up, now waiting for results queue to drain 30564 1726882881.05794: waiting for pending results... 
30564 1726882881.06124: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882881.06339: in run() - task 0e448fcc-3ce9-4216-acec-000000001a1c 30564 1726882881.06355: variable 'ansible_search_path' from source: unknown 30564 1726882881.06359: variable 'ansible_search_path' from source: unknown 30564 1726882881.06400: calling self._execute() 30564 1726882881.07249: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882881.07257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882881.07269: variable 'omit' from source: magic vars 30564 1726882881.08221: variable 'ansible_distribution_major_version' from source: facts 30564 1726882881.08235: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882881.08241: variable 'omit' from source: magic vars 30564 1726882881.08484: variable 'omit' from source: magic vars 30564 1726882881.08622: variable 'omit' from source: magic vars 30564 1726882881.08669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882881.08711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882881.08786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882881.08815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882881.08818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882881.08964: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882881.08968: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882881.08978: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882881.09200: Set connection var ansible_timeout to 10 30564 1726882881.09204: Set connection var ansible_pipelining to False 30564 1726882881.09207: Set connection var ansible_shell_type to sh 30564 1726882881.09213: Set connection var ansible_shell_executable to /bin/sh 30564 1726882881.09221: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882881.09245: Set connection var ansible_connection to ssh 30564 1726882881.09257: variable 'ansible_shell_executable' from source: unknown 30564 1726882881.09260: variable 'ansible_connection' from source: unknown 30564 1726882881.09407: variable 'ansible_module_compression' from source: unknown 30564 1726882881.09432: variable 'ansible_shell_type' from source: unknown 30564 1726882881.09436: variable 'ansible_shell_executable' from source: unknown 30564 1726882881.09438: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882881.09441: variable 'ansible_pipelining' from source: unknown 30564 1726882881.09443: variable 'ansible_timeout' from source: unknown 30564 1726882881.09445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882881.09890: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882881.09901: variable 'omit' from source: magic vars 30564 1726882881.09907: starting attempt loop 30564 1726882881.09910: running the handler 30564 1726882881.09972: _low_level_execute_command(): starting 30564 1726882881.09983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882881.12145: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882881.12157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882881.12170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.12190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.12232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.12242: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882881.12250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.12334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882881.12342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882881.12349: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882881.12357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.12369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.12385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.12390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.12396: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882881.12406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.12483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882881.12559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882881.12582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882881.12708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882881.14539: stdout chunk (state=3): >>>/root <<< 30564 1726882881.14549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882881.14552: stdout chunk (state=3): >>><<< 30564 1726882881.14593: stderr chunk (state=3): >>><<< 30564 1726882881.14599: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882881.14603: _low_level_execute_command(): starting 30564 1726882881.14660: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958 `" && echo ansible-tmp-1726882881.1458626-34031-47235744690958="` echo /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958 `" ) && sleep 0' 30564 1726882881.16165: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882881.16183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.16221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.16235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.16275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.16327: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882881.16337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.16350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882881.16359: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882881.16367: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882881.16384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.16398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.16416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.16434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.16543: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882881.16557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.16634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882881.16659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882881.16683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882881.16816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882881.18724: stdout chunk (state=3): >>>ansible-tmp-1726882881.1458626-34031-47235744690958=/root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958 <<< 30564 1726882881.18879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882881.18928: stderr chunk (state=3): >>><<< 30564 1726882881.18931: stdout chunk (state=3): >>><<< 30564 1726882881.19071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882881.1458626-34031-47235744690958=/root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882881.19074: variable 'ansible_module_compression' from source: unknown 30564 1726882881.19077: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882881.19180: variable 'ansible_facts' from source: unknown 30564 1726882881.19317: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958/AnsiballZ_package_facts.py 30564 1726882881.19739: Sending initial data 30564 1726882881.19742: Sent initial data (161 bytes) 30564 1726882881.20705: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882881.20722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.20738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.20755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.20802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.20814: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882881.20831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.20848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882881.20860: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882881.20877: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882881.20890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.20903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.20927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.20944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882881.20956: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882881.20975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.21061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882881.21087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882881.21104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882881.21231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882881.23002: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882881.23096: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882881.23198: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp0cmohr0u /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958/AnsiballZ_package_facts.py <<< 30564 1726882881.23297: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882881.26776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882881.26938: stderr chunk (state=3): >>><<< 30564 1726882881.26941: stdout chunk (state=3): >>><<< 30564 
1726882881.26944: done transferring module to remote 30564 1726882881.26951: _low_level_execute_command(): starting 30564 1726882881.26953: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958/ /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958/AnsiballZ_package_facts.py && sleep 0' 30564 1726882881.27707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882881.27722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.27739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.27758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.27810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.27826: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882881.27841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.27859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882881.27878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882881.27889: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882881.27900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.27912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.27929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.27941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.27952: stderr 
chunk (state=3): >>>debug2: match found <<< 30564 1726882881.27971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.28052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882881.28079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882881.28095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882881.28225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882881.30091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882881.30094: stdout chunk (state=3): >>><<< 30564 1726882881.30096: stderr chunk (state=3): >>><<< 30564 1726882881.30177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 30564 1726882881.30182: _low_level_execute_command(): starting 30564 1726882881.30185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958/AnsiballZ_package_facts.py && sleep 0' 30564 1726882881.31244: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882881.31274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.31289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.31310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.31354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.31374: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882881.31389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.31408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882881.31424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882881.31435: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882881.31457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882881.31476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882881.31503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882881.31522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882881.31533: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882881.31546: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882881.31639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882881.31659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882881.31684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882881.31836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882881.77948: stdout chunk (state=3): >>> <<< 30564 1726882881.77978: stdout chunk (state=3): >>>{"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", 
"version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", 
"release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", 
"version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": 
"libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": <<< 30564 1726882881.78003: stdout chunk (state=3): >>>[{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": 
[{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release":<<< 30564 1726882881.78132: stdout chunk (state=3): >>> "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", 
"version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": 
"openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": 
"libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.<<< 30564 1726882881.78189: stdout chunk (state=3): >>>1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": 
"057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-ba<<< 30564 1726882881.78203: stdout chunk (state=3): >>>se-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": 
[{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arc<<< 30564 1726882881.78207: stdout chunk (state=3): >>>h": "noarch", "source": "rpm"}], "policycoreutils-python-utils": 
[{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": 
[{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": 
"613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "a<<< 30564 1726882881.78213: stdout chunk (state=3): >>>rch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", 
"version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", 
"version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300",<<< 30564 1726882881.78216: stdout chunk (state=3): >>> "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": 
"perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", <<< 30564 1726882881.78220: stdout chunk (state=3): >>>"version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": 
"perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", 
"version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64"<<< 30564 1726882881.78222: stdout chunk (state=3): >>>, "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": 
"emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "r<<< 30564 1726882881.78226: stdout chunk (state=3): >>>elease": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882881.79683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
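The stdout streamed above is the JSON result of the `package_facts` module (its `invocation` block shows `manager: ["auto"]`, `strategy: "first"`). As a minimal sketch, assuming a payload of this shape (the sample below abbreviates the real dump to a single entry), the per-package facts can be parsed and queried like:

```python
import json

# Hypothetical miniature of the package_facts result streamed above:
# ansible_facts.packages maps each package name to a LIST of dicts,
# because multiple versions or architectures can be installed at once.
payload = json.loads("""
{"ansible_facts": {"packages": {
  "dnsmasq": [{"name": "dnsmasq", "version": "2.85",
               "release": "16.el9", "epoch": null,
               "arch": "x86_64", "source": "rpm"}]
}}}
""")

packages = payload["ansible_facts"]["packages"]
dnsmasq = packages["dnsmasq"][0]  # first (here: only) installed variant
print(dnsmasq["version"], dnsmasq["release"], dnsmasq["arch"])
```

In a playbook the same structure is reachable as `ansible_facts.packages['dnsmasq'][0].version` after a `package_facts` task has run; the list-per-name shape is why indexing with `[0]` (or looping) is needed even for singly-installed packages.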
<<< 30564 1726882881.79745: stderr chunk (state=3): >>><<< 30564 1726882881.79748: stdout chunk (state=3): >>><<< 30564 1726882881.79793: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.158 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.11.158 closed.
30564 1726882881.82941: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
30564 1726882881.82972: _low_level_execute_command(): starting
30564 1726882881.82982: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882881.1458626-34031-47235744690958/ > /dev/null 2>&1 && sleep 0'
30564 1726882881.84996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
30564 1726882881.85012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882881.85027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882881.85045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882881.85101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882881.85115: stderr chunk (state=3): >>>debug2: match not found <<<
30564 1726882881.85130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882881.85147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30564 1726882881.85158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
30564 1726882881.85175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30564 1726882881.85190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882881.85207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882881.85224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882881.85237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882881.85249: stderr chunk (state=3): >>>debug2: match found <<<
30564 1726882881.85271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882881.85348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882881.85375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882881.85393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882881.85526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882881.87445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882881.87448: stdout chunk (state=3): >>><<<
30564 1726882881.87451: stderr chunk (state=3): >>><<<
30564 1726882881.87775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.158 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
30564 1726882881.87779: handler run complete
30564 1726882881.89983: variable 'ansible_facts' from source: unknown
30564 1726882881.90998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882881.97408: variable 'ansible_facts' from source: unknown
30564 1726882881.98477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882881.99786: attempt loop complete, returning result
30564 1726882881.99816: _execute() done
30564 1726882881.99825: dumping result to json
30564 1726882882.00103: done dumping result, returning
30564 1726882882.00237: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000001a1c]
30564 1726882882.00254: sending task result for task 0e448fcc-3ce9-4216-acec-000000001a1c
ok: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
30564 1726882882.04079: no more pending results, returning what we have
30564 1726882882.04083: results queue empty
30564 1726882882.04085: checking for any_errors_fatal
30564 1726882882.04093: done checking for any_errors_fatal
30564 1726882882.04094: checking for max_fail_percentage
30564 1726882882.04096: done checking for max_fail_percentage
30564 1726882882.04097: checking to see if all hosts have failed and the running result is not ok
30564 1726882882.04098: done checking to see if all hosts have failed
30564 1726882882.04099: getting the remaining hosts for this loop
30564 1726882882.04102: done getting the remaining hosts for this loop
30564 1726882882.04106: getting the next task for host managed_node2
30564 1726882882.04116: done getting next task for host managed_node2
30564 1726882882.04120: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
30564 1726882882.04126: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882882.04140: getting variables
30564 1726882882.04142: in VariableManager get_vars()
30564 1726882882.04193: Calling all_inventory to load vars for managed_node2
30564 1726882882.04196: Calling groups_inventory to load vars for managed_node2
30564 1726882882.04199: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882882.04216: Calling all_plugins_play to load vars for managed_node2
30564 1726882882.04219: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882882.04222: Calling groups_plugins_play to load vars for managed_node2
30564 1726882882.05174: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001a1c
30564 1726882882.05179: WORKER PROCESS EXITING
30564 1726882882.06173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882882.09445: done with get_vars()
30564 1726882882.09485: done getting variables
30564 1726882882.09549: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 21:41:22 -0400 (0:00:01.042) 0:01:20.677 ******
30564 1726882882.09602: entering _queue_task() for managed_node2/debug
30564 1726882882.09943: worker is 1 (out of 1 available)
30564 1726882882.09956: exiting _queue_task() for managed_node2/debug
30564 1726882882.09970: done queuing things up, now waiting for results queue to drain
30564 1726882882.09972: waiting for pending results...
30564 1726882882.10719: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider
30564 1726882882.10874: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c0
30564 1726882882.10898: variable 'ansible_search_path' from source: unknown
30564 1726882882.10905: variable 'ansible_search_path' from source: unknown
30564 1726882882.10950: calling self._execute()
30564 1726882882.11060: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882882.11075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882882.11090: variable 'omit' from source: magic vars
30564 1726882882.11504: variable 'ansible_distribution_major_version' from source: facts
30564 1726882882.11521: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882882.11530: variable 'omit' from source: magic vars
30564 1726882882.11707: variable 'omit' from source: magic vars
30564 1726882882.11924: variable 'network_provider' from source: set_fact
30564 1726882882.11947: variable 'omit' from source: magic vars
30564 1726882882.12000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882882.12039: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882882.12125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882882.12150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882882.12227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882882.12262: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882882.12326: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882882.12335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882882.12551: Set connection var ansible_timeout to 10
30564 1726882882.12562: Set connection var ansible_pipelining to False
30564 1726882882.12572: Set connection var ansible_shell_type to sh
30564 1726882882.12583: Set connection var ansible_shell_executable to /bin/sh
30564 1726882882.12594: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882882.12600: Set connection var ansible_connection to ssh
30564 1726882882.12629: variable 'ansible_shell_executable' from source: unknown
30564 1726882882.12743: variable 'ansible_connection' from source: unknown
30564 1726882882.12756: variable 'ansible_module_compression' from source: unknown
30564 1726882882.12765: variable 'ansible_shell_type' from source: unknown
30564 1726882882.12773: variable 'ansible_shell_executable' from source: unknown
30564 1726882882.12780: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882882.12787: variable 'ansible_pipelining' from source: unknown
30564 1726882882.12793: variable 'ansible_timeout' from source: unknown
30564 1726882882.12800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882882.13053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882882.13280: variable 'omit' from source: magic vars
30564 1726882882.13292: starting attempt loop
30564 1726882882.13299: running the handler
30564 1726882882.13389: handler run complete
30564 1726882882.13408: attempt loop complete, returning result
30564 1726882882.13418: _execute() done
30564 1726882882.13425: dumping result to json
30564 1726882882.13430: done dumping result, returning
30564 1726882882.13441: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-0000000019c0] 30564 1726882882.13450: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c0 ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882882.13648: no more pending results, returning what we have 30564 1726882882.13651: results queue empty 30564 1726882882.13653: checking for any_errors_fatal 30564 1726882882.13673: done checking for any_errors_fatal 30564 1726882882.13675: checking for max_fail_percentage 30564 1726882882.13677: done checking for max_fail_percentage 30564 1726882882.13678: checking to see if all hosts have failed and the running result is not ok 30564 1726882882.13678: done checking to see if all hosts have failed 30564 1726882882.13679: getting the remaining hosts for this loop 30564 1726882882.13681: done getting the remaining hosts for this loop 30564 1726882882.13685: getting the next task for host managed_node2 30564 1726882882.13695: done getting next task for host managed_node2 30564 1726882882.13699: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882882.13704: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882882.13718: getting variables 30564 1726882882.13720: in VariableManager get_vars() 30564 1726882882.13766: Calling all_inventory to load vars for managed_node2 30564 1726882882.13769: Calling groups_inventory to load vars for managed_node2 30564 1726882882.13772: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882882.13784: Calling all_plugins_play to load vars for managed_node2 30564 1726882882.13787: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882882.13791: Calling groups_plugins_play to load vars for managed_node2 30564 1726882882.14782: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c0 30564 1726882882.14785: WORKER PROCESS EXITING 30564 1726882882.16630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882882.19336: done with get_vars() 30564 1726882882.19371: done getting variables 30564 1726882882.19432: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:22 -0400 (0:00:00.101) 0:01:20.778 ****** 30564 1726882882.19769: entering _queue_task() for managed_node2/fail 30564 1726882882.20145: worker is 1 (out of 1 available) 30564 1726882882.20157: exiting _queue_task() for managed_node2/fail 30564 1726882882.20175: done queuing things up, now waiting for results queue to drain 30564 1726882882.20177: waiting for pending results... 30564 1726882882.20487: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882882.20639: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c1 30564 1726882882.20659: variable 'ansible_search_path' from source: unknown 30564 1726882882.20670: variable 'ansible_search_path' from source: unknown 30564 1726882882.20711: calling self._execute() 30564 1726882882.20820: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882882.20833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882882.20850: variable 'omit' from source: magic vars 30564 1726882882.21256: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.21279: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882882.21407: variable 'network_state' from source: role '' defaults 30564 1726882882.21423: Evaluated conditional (network_state != {}): False 30564 1726882882.21430: when evaluation is False, skipping this task 30564 1726882882.21437: _execute() done 30564 1726882882.21443: dumping result to json 30564 1726882882.21449: done dumping result, returning 30564 1726882882.21458: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-0000000019c1] 30564 1726882882.21471: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c1 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882882.21658: no more pending results, returning what we have 30564 1726882882.21662: results queue empty 30564 1726882882.21665: checking for any_errors_fatal 30564 1726882882.21675: done checking for any_errors_fatal 30564 1726882882.21676: checking for max_fail_percentage 30564 1726882882.21678: done checking for max_fail_percentage 30564 1726882882.21679: checking to see if all hosts have failed and the running result is not ok 30564 1726882882.21680: done checking to see if all hosts have failed 30564 1726882882.21681: getting the remaining hosts for this loop 30564 1726882882.21683: done getting the remaining hosts for this loop 30564 1726882882.21687: getting the next task for host managed_node2 30564 1726882882.21696: done getting next task for host managed_node2 30564 1726882882.21700: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882882.21706: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882882.21733: getting variables 30564 1726882882.21735: in VariableManager get_vars() 30564 1726882882.21778: Calling all_inventory to load vars for managed_node2 30564 1726882882.21781: Calling groups_inventory to load vars for managed_node2 30564 1726882882.21783: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882882.21796: Calling all_plugins_play to load vars for managed_node2 30564 1726882882.21799: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882882.21802: Calling groups_plugins_play to load vars for managed_node2 30564 1726882882.23180: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c1 30564 1726882882.23184: WORKER PROCESS EXITING 30564 1726882882.23996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882882.25730: done with get_vars() 30564 1726882882.25755: done getting variables 30564 1726882882.25817: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:22 -0400 (0:00:00.060) 0:01:20.839 ****** 30564 1726882882.25853: entering _queue_task() for managed_node2/fail 30564 1726882882.26349: worker is 1 (out of 1 available) 30564 1726882882.26435: exiting _queue_task() for managed_node2/fail 30564 1726882882.26809: done queuing things up, now waiting for results queue to drain 30564 1726882882.26903: waiting for pending results... 30564 1726882882.26921: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882882.26961: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c2 30564 1726882882.26978: variable 'ansible_search_path' from source: unknown 30564 1726882882.26990: variable 'ansible_search_path' from source: unknown 30564 1726882882.27072: calling self._execute() 30564 1726882882.27201: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882882.27206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882882.27211: variable 'omit' from source: magic vars 30564 1726882882.27610: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.27639: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882882.27811: variable 'network_state' from source: role '' defaults 30564 1726882882.27820: Evaluated conditional (network_state != {}): False 30564 1726882882.27824: when evaluation is False, skipping this task 30564 1726882882.27827: _execute() done 30564 1726882882.27836: dumping result to json 30564 1726882882.27840: done dumping result, returning 30564 1726882882.27846: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-0000000019c2] 30564 1726882882.27859: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c2 30564 1726882882.28032: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c2 30564 1726882882.28035: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882882.28212: no more pending results, returning what we have 30564 1726882882.28215: results queue empty 30564 1726882882.28216: checking for any_errors_fatal 30564 1726882882.28246: done checking for any_errors_fatal 30564 1726882882.28248: checking for max_fail_percentage 30564 1726882882.28250: done checking for max_fail_percentage 30564 1726882882.28251: checking to see if all hosts have failed and the running result is not ok 30564 1726882882.28252: done checking to see if all hosts have failed 30564 1726882882.28253: getting the remaining hosts for this loop 30564 1726882882.28257: done getting the remaining hosts for this loop 30564 1726882882.28262: getting the next task for host managed_node2 30564 1726882882.28275: done getting next task for host managed_node2 30564 1726882882.28283: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882882.28294: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882882.28328: getting variables 30564 1726882882.28330: in VariableManager get_vars() 30564 1726882882.28381: Calling all_inventory to load vars for managed_node2 30564 1726882882.28387: Calling groups_inventory to load vars for managed_node2 30564 1726882882.28392: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882882.28404: Calling all_plugins_play to load vars for managed_node2 30564 1726882882.28410: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882882.28419: Calling groups_plugins_play to load vars for managed_node2 30564 1726882882.30121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882882.32080: done with get_vars() 30564 1726882882.32104: done getting variables 30564 1726882882.32148: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 21:41:22 -0400 (0:00:00.063) 0:01:20.903 ****** 30564 1726882882.32177: entering _queue_task() for managed_node2/fail 30564 1726882882.32440: worker is 1 (out of 1 available) 30564 1726882882.32456: exiting _queue_task() for managed_node2/fail 30564 1726882882.32473: done queuing things up, now waiting for results queue to drain 30564 1726882882.32475: waiting for pending results... 30564 1726882882.32724: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882882.32908: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c3 30564 1726882882.32912: variable 'ansible_search_path' from source: unknown 30564 1726882882.32916: variable 'ansible_search_path' from source: unknown 30564 1726882882.32974: calling self._execute() 30564 1726882882.33125: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882882.33129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882882.33141: variable 'omit' from source: magic vars 30564 1726882882.33572: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.33598: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882882.33831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882882.36661: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882882.36739: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882882.36768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882882.36800: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 
1726882882.36819: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882882.36886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.36909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.36926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.36954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.36966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.37045: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.37060: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882882.37065: when evaluation is False, skipping this task 30564 1726882882.37068: _execute() done 30564 1726882882.37071: dumping result to json 30564 1726882882.37073: done dumping result, returning 30564 1726882882.37084: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-0000000019c3] 30564 1726882882.37090: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c3 30564 
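The conditional evaluated just above, ansible_distribution_major_version | int > 9, belongs to the fail task at roles/network/tasks/main.yml:25. Reconstructed from the task name and the evaluations recorded in this log, the task is roughly shaped as follows (a hedged sketch — the actual task body and failure message in fedora.linux_system_roles.network may differ):

```yaml
# Hedged reconstruction from this log's task name and evaluated conditional;
# the msg text is an assumption, not the role's actual wording.
- name: Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later.  # assumed message
  when: ansible_distribution_major_version | int > 9
```

On this host the fact is below 10, so the when: evaluates False and the task is skipped, exactly as the skip_reason in the result above reports.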
1726882882.37195: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c3 30564 1726882882.37198: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882882.37268: no more pending results, returning what we have 30564 1726882882.37272: results queue empty 30564 1726882882.37274: checking for any_errors_fatal 30564 1726882882.37283: done checking for any_errors_fatal 30564 1726882882.37283: checking for max_fail_percentage 30564 1726882882.37285: done checking for max_fail_percentage 30564 1726882882.37286: checking to see if all hosts have failed and the running result is not ok 30564 1726882882.37287: done checking to see if all hosts have failed 30564 1726882882.37288: getting the remaining hosts for this loop 30564 1726882882.37289: done getting the remaining hosts for this loop 30564 1726882882.37293: getting the next task for host managed_node2 30564 1726882882.37301: done getting next task for host managed_node2 30564 1726882882.37308: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882882.37312: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882882.37334: getting variables 30564 1726882882.37336: in VariableManager get_vars() 30564 1726882882.37376: Calling all_inventory to load vars for managed_node2 30564 1726882882.37379: Calling groups_inventory to load vars for managed_node2 30564 1726882882.37381: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882882.37391: Calling all_plugins_play to load vars for managed_node2 30564 1726882882.37394: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882882.37397: Calling groups_plugins_play to load vars for managed_node2 30564 1726882882.38548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882882.40479: done with get_vars() 30564 1726882882.40511: done getting variables 30564 1726882882.40629: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:22 -0400 (0:00:00.084) 0:01:20.987 ****** 30564 1726882882.40658: entering _queue_task() for managed_node2/dnf 30564 1726882882.41410: worker is 1 (out of 1 available) 30564 1726882882.41442: exiting _queue_task() for managed_node2/dnf 30564 1726882882.41456: done queuing things up, now waiting for results queue to drain 30564 1726882882.41490: waiting for pending results... 30564 1726882882.42147: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882882.42248: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c4 30564 1726882882.42277: variable 'ansible_search_path' from source: unknown 30564 1726882882.42282: variable 'ansible_search_path' from source: unknown 30564 1726882882.42447: calling self._execute() 30564 1726882882.42491: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882882.42494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882882.42525: variable 'omit' from source: magic vars 30564 1726882882.43279: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.43283: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882882.43388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882882.45759: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882882.45809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882882.45839: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
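The task entered here, at roles/network/tasks/main.yml:36, runs the dnf action only when wireless or team connection profiles are defined, which is why the log goes on to skip it. A hedged sketch reconstructed from the task name and the conditionals this log evaluates (the package-list variable name and module parameters are assumptions for illustration):

```yaml
# Hedged reconstruction; the real task body in the network role may differ.
# network_packages is an assumed variable name, and state/check_mode are
# illustrative parameters inferred from "check if updates are available".
- name: Check if updates for network packages are available through the
    DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Both defined-flags come from the role's defaults (note the "from source: role '' defaults" lines), computed against network_connections; with neither wireless nor team profiles present, the combined condition is False and the task skips.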
1726882882.45867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882882.45892: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882882.45952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.45974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.45993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.46023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.46034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.46115: variable 'ansible_distribution' from source: facts 30564 1726882882.46119: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.46131: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882882.46206: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882882.46295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882882.46312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.46331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.46359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.46373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.46399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.46415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.46433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.46460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.46473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.46499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.46515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.46531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.46581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.46592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.46692: variable 'network_connections' from source: include params 30564 1726882882.46701: variable 'interface' from source: play vars 30564 1726882882.46744: variable 'interface' from source: play vars 30564 1726882882.46797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882882.47401: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882882.47404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882882.47406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882882.47408: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882882.47411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882882.47413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882882.47423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.47757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882882.47760: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882882.47766: variable 'network_connections' from source: include params 30564 1726882882.47771: variable 'interface' from source: play vars 30564 1726882882.47774: variable 'interface' from source: play vars 30564 1726882882.47777: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882882.47779: when evaluation is False, skipping this task 30564 1726882882.47782: _execute() done 30564 1726882882.47784: dumping result to json 30564 1726882882.47787: done dumping result, returning 30564 1726882882.47790: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000019c4] 30564 1726882882.47791: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c4 skipping: [managed_node2] 
=> { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882882.48245: no more pending results, returning what we have 30564 1726882882.48252: results queue empty 30564 1726882882.48253: checking for any_errors_fatal 30564 1726882882.48287: done checking for any_errors_fatal 30564 1726882882.48289: checking for max_fail_percentage 30564 1726882882.48291: done checking for max_fail_percentage 30564 1726882882.48292: checking to see if all hosts have failed and the running result is not ok 30564 1726882882.48293: done checking to see if all hosts have failed 30564 1726882882.48299: getting the remaining hosts for this loop 30564 1726882882.48300: done getting the remaining hosts for this loop 30564 1726882882.48308: getting the next task for host managed_node2 30564 1726882882.48317: done getting next task for host managed_node2 30564 1726882882.48321: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882882.48326: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882882.48355: getting variables 30564 1726882882.48357: in VariableManager get_vars() 30564 1726882882.48403: Calling all_inventory to load vars for managed_node2 30564 1726882882.48406: Calling groups_inventory to load vars for managed_node2 30564 1726882882.48408: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882882.48418: Calling all_plugins_play to load vars for managed_node2 30564 1726882882.48421: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882882.48423: Calling groups_plugins_play to load vars for managed_node2 30564 1726882882.49107: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c4 30564 1726882882.49111: WORKER PROCESS EXITING 30564 1726882882.51118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882882.53595: done with get_vars() 30564 1726882882.53634: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882882.53710: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:22 -0400 (0:00:00.130) 0:01:21.118 ****** 30564 1726882882.53756: entering _queue_task() for managed_node2/yum 30564 1726882882.54158: worker is 1 (out of 1 available) 30564 1726882882.54178: exiting _queue_task() for managed_node2/yum 30564 1726882882.54192: done queuing things up, now waiting for results queue to drain 30564 1726882882.54193: waiting for pending results... 30564 1726882882.54539: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882882.54789: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c5 30564 1726882882.54844: variable 'ansible_search_path' from source: unknown 30564 1726882882.54852: variable 'ansible_search_path' from source: unknown 30564 1726882882.54913: calling self._execute() 30564 1726882882.55095: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882882.55119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882882.55136: variable 'omit' from source: magic vars 30564 1726882882.56010: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.56038: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882882.56399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882882.60887: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882882.61017: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882882.61126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
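The YUM-era counterpart entered here (roles/network/tasks/main.yml:48) is the same availability check gated to older distributions; note the log's "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line just above — on a Python 3 / EL8+ host the yum action is served by the dnf plugin. A hedged sketch (the second condition is inferred from the task name; the log only shows the first one short-circuiting to False):

```yaml
# Hedged reconstruction; the real task body may differ. network_packages
# is an assumed variable name, and the wireless/team gate is inferred
# from the task name rather than shown evaluated in this log.
- name: Check if updates for network packages are available through the
    YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined
```

Since this host's major version is 8 or higher, the | int < 8 condition is False and the task skips without ever evaluating the wireless/team gate.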
1726882882.61193: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882882.61244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882882.61388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.61435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.61467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.61539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.61560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.61666: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.61682: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882882.61685: when evaluation is False, skipping this task 30564 1726882882.61688: _execute() done 30564 1726882882.61690: dumping result to json 30564 1726882882.61692: done dumping result, returning 30564 1726882882.61700: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or 
team interfaces [0e448fcc-3ce9-4216-acec-0000000019c5] 30564 1726882882.61706: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c5 30564 1726882882.61834: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c5 30564 1726882882.61836: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882882.61889: no more pending results, returning what we have 30564 1726882882.61894: results queue empty 30564 1726882882.61895: checking for any_errors_fatal 30564 1726882882.61903: done checking for any_errors_fatal 30564 1726882882.61904: checking for max_fail_percentage 30564 1726882882.61906: done checking for max_fail_percentage 30564 1726882882.61907: checking to see if all hosts have failed and the running result is not ok 30564 1726882882.61907: done checking to see if all hosts have failed 30564 1726882882.61908: getting the remaining hosts for this loop 30564 1726882882.61910: done getting the remaining hosts for this loop 30564 1726882882.61914: getting the next task for host managed_node2 30564 1726882882.61922: done getting next task for host managed_node2 30564 1726882882.61927: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882882.61932: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882882.61954: getting variables 30564 1726882882.61956: in VariableManager get_vars() 30564 1726882882.61996: Calling all_inventory to load vars for managed_node2 30564 1726882882.61999: Calling groups_inventory to load vars for managed_node2 30564 1726882882.62001: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882882.62042: Calling all_plugins_play to load vars for managed_node2 30564 1726882882.62047: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882882.62051: Calling groups_plugins_play to load vars for managed_node2 30564 1726882882.66247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882882.68331: done with get_vars() 30564 1726882882.68363: done getting variables 30564 1726882882.68438: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 
21:41:22 -0400 (0:00:00.147) 0:01:21.266 ****** 30564 1726882882.68479: entering _queue_task() for managed_node2/fail 30564 1726882882.68855: worker is 1 (out of 1 available) 30564 1726882882.68873: exiting _queue_task() for managed_node2/fail 30564 1726882882.68889: done queuing things up, now waiting for results queue to drain 30564 1726882882.68891: waiting for pending results... 30564 1726882882.69218: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882882.69332: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c6 30564 1726882882.69343: variable 'ansible_search_path' from source: unknown 30564 1726882882.69346: variable 'ansible_search_path' from source: unknown 30564 1726882882.69383: calling self._execute() 30564 1726882882.69513: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882882.69524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882882.69538: variable 'omit' from source: magic vars 30564 1726882882.70018: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.70042: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882882.70180: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882882.70790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882882.84886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882882.84991: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882882.85057: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882882.85110: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882882.85153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882882.85235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.85277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.85324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.85411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.85432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.85500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.85531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.85593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 30564 1726882882.85647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.85676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.85736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882882.85773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882882.85812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.85860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882882.85891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882882.86090: variable 'network_connections' from source: include params 30564 1726882882.86111: variable 'interface' from source: play vars 30564 1726882882.86198: variable 'interface' from source: play vars 30564 1726882882.86296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882882.86500: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882882.86545: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882882.86592: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882882.86624: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882882.86690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882882.86736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882882.86785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882882.86828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882882.86894: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882882.87210: variable 'network_connections' from source: include params 30564 1726882882.87225: variable 'interface' from source: play vars 30564 1726882882.87318: variable 'interface' from source: play vars 30564 1726882882.87344: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882882.87355: when evaluation is False, skipping this task 30564 1726882882.87362: _execute() done 30564 1726882882.87372: dumping result to json 30564 1726882882.87380: done dumping result, returning 30564 1726882882.87390: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000019c6] 30564 1726882882.87398: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c6 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882882.87585: no more pending results, returning what we have 30564 1726882882.87588: results queue empty 30564 1726882882.87589: checking for any_errors_fatal 30564 1726882882.87598: done checking for any_errors_fatal 30564 1726882882.87599: checking for max_fail_percentage 30564 1726882882.87601: done checking for max_fail_percentage 30564 1726882882.87602: checking to see if all hosts have failed and the running result is not ok 30564 1726882882.87603: done checking to see if all hosts have failed 30564 1726882882.87604: getting the remaining hosts for this loop 30564 1726882882.87605: done getting the remaining hosts for this loop 30564 1726882882.87610: getting the next task for host managed_node2 30564 1726882882.87619: done getting next task for host managed_node2 30564 1726882882.87623: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882882.87628: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882882.87654: getting variables 30564 1726882882.87656: in VariableManager get_vars() 30564 1726882882.87704: Calling all_inventory to load vars for managed_node2 30564 1726882882.87706: Calling groups_inventory to load vars for managed_node2 30564 1726882882.87708: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882882.87717: Calling all_plugins_play to load vars for managed_node2 30564 1726882882.87719: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882882.87721: Calling groups_plugins_play to load vars for managed_node2 30564 1726882882.88689: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c6 30564 1726882882.88693: WORKER PROCESS EXITING 30564 1726882882.95896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882882.97773: done with get_vars() 30564 1726882882.97802: done getting variables 30564 1726882882.97851: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:22 -0400 (0:00:00.294) 0:01:21.560 ****** 30564 1726882882.97897: entering _queue_task() for managed_node2/package 30564 1726882882.98261: worker is 1 (out of 1 available) 30564 1726882882.98282: exiting _queue_task() for managed_node2/package 30564 1726882882.98299: done queuing things up, now waiting for results queue to drain 30564 1726882882.98302: waiting for pending results... 30564 1726882882.98642: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882882.98813: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c7 30564 1726882882.98834: variable 'ansible_search_path' from source: unknown 30564 1726882882.98848: variable 'ansible_search_path' from source: unknown 30564 1726882882.98898: calling self._execute() 30564 1726882882.99027: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882882.99042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882882.99062: variable 'omit' from source: magic vars 30564 1726882882.99510: variable 'ansible_distribution_major_version' from source: facts 30564 1726882882.99538: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882882.99791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882883.00156: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882883.00228: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882883.00355: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882883.00454: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 
1726882883.00602: variable 'network_packages' from source: role '' defaults 30564 1726882883.00781: variable '__network_provider_setup' from source: role '' defaults 30564 1726882883.00797: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882883.00886: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882883.01006: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882883.01072: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882883.01278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882883.04579: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882883.04655: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882883.04712: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882883.04754: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882883.04809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882883.04908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.04944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.04990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 30564 1726882883.05040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.05061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.05124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.05153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.05192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.05245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.05270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.05700: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882883.05838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.05877: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.05907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.05963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.05987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.06093: variable 'ansible_python' from source: facts 30564 1726882883.06114: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882883.06213: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882883.06299: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882883.06423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.06447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.06487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.06533: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.06551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.07254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.07350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.07381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.07555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.07581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.07853: variable 'network_connections' from source: include params 30564 1726882883.07889: variable 'interface' from source: play vars 30564 1726882883.08017: variable 'interface' from source: play vars 30564 1726882883.08108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882883.08142: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882883.08187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.08234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882883.08303: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882883.08643: variable 'network_connections' from source: include params 30564 1726882883.08656: variable 'interface' from source: play vars 30564 1726882883.08777: variable 'interface' from source: play vars 30564 1726882883.08816: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882883.08909: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882883.09281: variable 'network_connections' from source: include params 30564 1726882883.09299: variable 'interface' from source: play vars 30564 1726882883.09381: variable 'interface' from source: play vars 30564 1726882883.09417: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882883.09514: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882883.09880: variable 'network_connections' from source: include params 30564 1726882883.09889: variable 'interface' from source: play vars 30564 1726882883.09971: variable 'interface' from source: play vars 30564 1726882883.10033: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882883.10110: variable '__network_service_name_default_initscripts' from source: role '' defaults 
30564 1726882883.10123: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882883.10187: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882883.10428: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882883.11002: variable 'network_connections' from source: include params 30564 1726882883.11012: variable 'interface' from source: play vars 30564 1726882883.11087: variable 'interface' from source: play vars 30564 1726882883.11104: variable 'ansible_distribution' from source: facts 30564 1726882883.11113: variable '__network_rh_distros' from source: role '' defaults 30564 1726882883.11123: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.11147: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882883.11594: variable 'ansible_distribution' from source: facts 30564 1726882883.11598: variable '__network_rh_distros' from source: role '' defaults 30564 1726882883.11604: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.11616: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882883.11787: variable 'ansible_distribution' from source: facts 30564 1726882883.11790: variable '__network_rh_distros' from source: role '' defaults 30564 1726882883.11795: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.11831: variable 'network_provider' from source: set_fact 30564 1726882883.11851: variable 'ansible_facts' from source: unknown 30564 1726882883.12818: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882883.12821: when evaluation is False, skipping this task 30564 1726882883.12828: _execute() done 30564 1726882883.12831: dumping result to json 30564 1726882883.12833: done dumping result, returning 
30564 1726882883.12842: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-0000000019c7] 30564 1726882883.12848: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c7 30564 1726882883.12951: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c7 30564 1726882883.12954: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882883.13008: no more pending results, returning what we have 30564 1726882883.13011: results queue empty 30564 1726882883.13012: checking for any_errors_fatal 30564 1726882883.13022: done checking for any_errors_fatal 30564 1726882883.13023: checking for max_fail_percentage 30564 1726882883.13025: done checking for max_fail_percentage 30564 1726882883.13026: checking to see if all hosts have failed and the running result is not ok 30564 1726882883.13027: done checking to see if all hosts have failed 30564 1726882883.13027: getting the remaining hosts for this loop 30564 1726882883.13030: done getting the remaining hosts for this loop 30564 1726882883.13034: getting the next task for host managed_node2 30564 1726882883.13043: done getting next task for host managed_node2 30564 1726882883.13048: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882883.13052: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882883.13083: getting variables 30564 1726882883.13085: in VariableManager get_vars() 30564 1726882883.13129: Calling all_inventory to load vars for managed_node2 30564 1726882883.13131: Calling groups_inventory to load vars for managed_node2 30564 1726882883.13134: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882883.13144: Calling all_plugins_play to load vars for managed_node2 30564 1726882883.13147: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882883.13149: Calling groups_plugins_play to load vars for managed_node2 30564 1726882883.16189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882883.17704: done with get_vars() 30564 1726882883.17740: done getting variables 30564 1726882883.17833: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when 
using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:23 -0400 (0:00:00.199) 0:01:21.760 ****** 30564 1726882883.17891: entering _queue_task() for managed_node2/package 30564 1726882883.18263: worker is 1 (out of 1 available) 30564 1726882883.18282: exiting _queue_task() for managed_node2/package 30564 1726882883.18299: done queuing things up, now waiting for results queue to drain 30564 1726882883.18301: waiting for pending results... 30564 1726882883.18622: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882883.18804: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c8 30564 1726882883.18824: variable 'ansible_search_path' from source: unknown 30564 1726882883.18833: variable 'ansible_search_path' from source: unknown 30564 1726882883.18884: calling self._execute() 30564 1726882883.18999: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882883.19011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882883.19027: variable 'omit' from source: magic vars 30564 1726882883.19362: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.19377: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882883.19469: variable 'network_state' from source: role '' defaults 30564 1726882883.19482: Evaluated conditional (network_state != {}): False 30564 1726882883.19485: when evaluation is False, skipping this task 30564 1726882883.19488: _execute() done 30564 1726882883.19493: dumping result to json 30564 1726882883.19497: done dumping result, returning 30564 1726882883.19502: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state 
variable [0e448fcc-3ce9-4216-acec-0000000019c8] 30564 1726882883.19505: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c8 30564 1726882883.19648: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c8 30564 1726882883.19650: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882883.19701: no more pending results, returning what we have 30564 1726882883.19705: results queue empty 30564 1726882883.19706: checking for any_errors_fatal 30564 1726882883.19713: done checking for any_errors_fatal 30564 1726882883.19723: checking for max_fail_percentage 30564 1726882883.19724: done checking for max_fail_percentage 30564 1726882883.19725: checking to see if all hosts have failed and the running result is not ok 30564 1726882883.19726: done checking to see if all hosts have failed 30564 1726882883.19727: getting the remaining hosts for this loop 30564 1726882883.19729: done getting the remaining hosts for this loop 30564 1726882883.19732: getting the next task for host managed_node2 30564 1726882883.19886: done getting next task for host managed_node2 30564 1726882883.19891: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882883.19896: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882883.19917: getting variables 30564 1726882883.19919: in VariableManager get_vars() 30564 1726882883.19954: Calling all_inventory to load vars for managed_node2 30564 1726882883.19957: Calling groups_inventory to load vars for managed_node2 30564 1726882883.19960: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882883.19974: Calling all_plugins_play to load vars for managed_node2 30564 1726882883.19977: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882883.19981: Calling groups_plugins_play to load vars for managed_node2 30564 1726882883.21463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882883.23209: done with get_vars() 30564 1726882883.23228: done getting variables 30564 1726882883.23274: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:23 -0400 
(0:00:00.054) 0:01:21.814 ****** 30564 1726882883.23310: entering _queue_task() for managed_node2/package 30564 1726882883.23553: worker is 1 (out of 1 available) 30564 1726882883.23567: exiting _queue_task() for managed_node2/package 30564 1726882883.23581: done queuing things up, now waiting for results queue to drain 30564 1726882883.23583: waiting for pending results... 30564 1726882883.23788: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882883.23889: in run() - task 0e448fcc-3ce9-4216-acec-0000000019c9 30564 1726882883.23931: variable 'ansible_search_path' from source: unknown 30564 1726882883.23935: variable 'ansible_search_path' from source: unknown 30564 1726882883.23946: calling self._execute() 30564 1726882883.24042: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882883.24048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882883.24053: variable 'omit' from source: magic vars 30564 1726882883.24447: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.24475: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882883.24596: variable 'network_state' from source: role '' defaults 30564 1726882883.24606: Evaluated conditional (network_state != {}): False 30564 1726882883.24610: when evaluation is False, skipping this task 30564 1726882883.24613: _execute() done 30564 1726882883.24616: dumping result to json 30564 1726882883.24618: done dumping result, returning 30564 1726882883.24627: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-0000000019c9] 30564 1726882883.24632: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019c9 30564 1726882883.24733: done sending task result for task 
0e448fcc-3ce9-4216-acec-0000000019c9 30564 1726882883.24736: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882883.24819: no more pending results, returning what we have 30564 1726882883.24823: results queue empty 30564 1726882883.24825: checking for any_errors_fatal 30564 1726882883.24836: done checking for any_errors_fatal 30564 1726882883.24837: checking for max_fail_percentage 30564 1726882883.24838: done checking for max_fail_percentage 30564 1726882883.24839: checking to see if all hosts have failed and the running result is not ok 30564 1726882883.24840: done checking to see if all hosts have failed 30564 1726882883.24841: getting the remaining hosts for this loop 30564 1726882883.24843: done getting the remaining hosts for this loop 30564 1726882883.24857: getting the next task for host managed_node2 30564 1726882883.24867: done getting next task for host managed_node2 30564 1726882883.24872: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882883.24879: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
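[Annotation] The skip result above shows the role's conditional install of python3-libnmstate: `network_state` comes from the role defaults as `{}`, so the `when: network_state != {}` condition evaluates False and the package task never runs. The actual content of tasks/main.yml:96 is not shown in this log, so the following is only an illustrative sketch of what such a conditional package task looks like:

```yaml
# Hypothetical sketch of the conditional package task skipped above.
# The real task lives at roles/network/tasks/main.yml:96; field values are illustrative.
- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate
    state: present
  # Evaluated False in this run because network_state defaulted to {}
  when: network_state != {}
```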
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882883.24900: getting variables 30564 1726882883.24902: in VariableManager get_vars() 30564 1726882883.24939: Calling all_inventory to load vars for managed_node2 30564 1726882883.24942: Calling groups_inventory to load vars for managed_node2 30564 1726882883.24945: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882883.24969: Calling all_plugins_play to load vars for managed_node2 30564 1726882883.24973: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882883.24977: Calling groups_plugins_play to load vars for managed_node2 30564 1726882883.26734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882883.28661: done with get_vars() 30564 1726882883.28689: done getting variables 30564 1726882883.28738: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:23 -0400 (0:00:00.054) 0:01:21.868 ****** 30564 1726882883.28767: entering _queue_task() for managed_node2/service 30564 1726882883.29025: worker is 1 (out of 1 available) 30564 1726882883.29038: exiting _queue_task() for managed_node2/service 30564 1726882883.29050: done 
queuing things up, now waiting for results queue to drain 30564 1726882883.29051: waiting for pending results... 30564 1726882883.29259: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882883.29356: in run() - task 0e448fcc-3ce9-4216-acec-0000000019ca 30564 1726882883.29371: variable 'ansible_search_path' from source: unknown 30564 1726882883.29375: variable 'ansible_search_path' from source: unknown 30564 1726882883.29402: calling self._execute() 30564 1726882883.29493: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882883.29497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882883.29506: variable 'omit' from source: magic vars 30564 1726882883.29806: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.29817: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882883.29908: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882883.30043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882883.33231: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882883.33347: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882883.33353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882883.33400: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882883.33421: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882883.33500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.33528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.33546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.33583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.33609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.33645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.33661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.33726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.33777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.33841: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.33846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.33895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.34134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.34585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.34588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.34609: variable 'network_connections' from source: include params 30564 1726882883.34612: variable 'interface' from source: play vars 30564 1726882883.34615: variable 'interface' from source: play vars 30564 1726882883.34617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882883.34689: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882883.34730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882883.34761: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882883.34799: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882883.34838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882883.34862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882883.34897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.34924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882883.35013: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882883.35346: variable 'network_connections' from source: include params 30564 1726882883.35349: variable 'interface' from source: play vars 30564 1726882883.35445: variable 'interface' from source: play vars 30564 1726882883.35468: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882883.35486: when evaluation is False, skipping this task 30564 1726882883.35495: _execute() done 30564 1726882883.35498: dumping result to json 30564 1726882883.35501: done dumping result, returning 30564 1726882883.35504: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000019ca] 30564 1726882883.35532: sending task result for task 
0e448fcc-3ce9-4216-acec-0000000019ca skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882883.35764: no more pending results, returning what we have 30564 1726882883.35773: results queue empty 30564 1726882883.35775: checking for any_errors_fatal 30564 1726882883.35787: done checking for any_errors_fatal 30564 1726882883.35789: checking for max_fail_percentage 30564 1726882883.35791: done checking for max_fail_percentage 30564 1726882883.35792: checking to see if all hosts have failed and the running result is not ok 30564 1726882883.35793: done checking to see if all hosts have failed 30564 1726882883.35794: getting the remaining hosts for this loop 30564 1726882883.35796: done getting the remaining hosts for this loop 30564 1726882883.35803: getting the next task for host managed_node2 30564 1726882883.35817: done getting next task for host managed_node2 30564 1726882883.35827: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882883.35833: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882883.35876: getting variables 30564 1726882883.35882: in VariableManager get_vars() 30564 1726882883.35944: Calling all_inventory to load vars for managed_node2 30564 1726882883.35950: Calling groups_inventory to load vars for managed_node2 30564 1726882883.35955: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882883.35976: Calling all_plugins_play to load vars for managed_node2 30564 1726882883.35983: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882883.35990: Calling groups_plugins_play to load vars for managed_node2 30564 1726882883.36775: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019ca 30564 1726882883.36779: WORKER PROCESS EXITING 30564 1726882883.38224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882883.39333: done with get_vars() 30564 1726882883.39350: done getting variables 30564 1726882883.39396: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:23 -0400 (0:00:00.106) 0:01:21.975 ****** 30564 1726882883.39421: entering _queue_task() for managed_node2/service 30564 1726882883.39658: worker is 1 (out of 1 available) 30564 
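[Annotation] The second skip above follows the same pattern: the log records `ActionModule 'service'` being loaded and the condition `__network_wireless_connections_defined or __network_team_connections_defined` evaluating False, since the play's `network_connections` define neither wireless nor team interfaces. A minimal sketch of such a conditional restart task (the real task at tasks/main.yml:109 is not shown here):

```yaml
# Hypothetical sketch of the conditional service restart skipped above
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  # Skipped here: no wireless or team connections were defined in network_connections
  when: __network_wireless_connections_defined or __network_team_connections_defined
```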
1726882883.39673: exiting _queue_task() for managed_node2/service 30564 1726882883.39686: done queuing things up, now waiting for results queue to drain 30564 1726882883.39687: waiting for pending results... 30564 1726882883.39884: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882883.39992: in run() - task 0e448fcc-3ce9-4216-acec-0000000019cb 30564 1726882883.40008: variable 'ansible_search_path' from source: unknown 30564 1726882883.40011: variable 'ansible_search_path' from source: unknown 30564 1726882883.40037: calling self._execute() 30564 1726882883.40178: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882883.40198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882883.40222: variable 'omit' from source: magic vars 30564 1726882883.40750: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.40777: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882883.40984: variable 'network_provider' from source: set_fact 30564 1726882883.41002: variable 'network_state' from source: role '' defaults 30564 1726882883.41038: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882883.41052: variable 'omit' from source: magic vars 30564 1726882883.41152: variable 'omit' from source: magic vars 30564 1726882883.41188: variable 'network_service_name' from source: role '' defaults 30564 1726882883.41290: variable 'network_service_name' from source: role '' defaults 30564 1726882883.41444: variable '__network_provider_setup' from source: role '' defaults 30564 1726882883.41498: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882883.41600: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882883.41634: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882883.41729: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882883.42075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882883.44283: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882883.44349: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882883.44384: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882883.44411: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882883.44433: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882883.44496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.44517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.44535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.44564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.44579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30564 1726882883.44612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.44630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.44653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.44697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.44710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.44873: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882883.44952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.44973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.44989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.45037: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.45040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.45116: variable 'ansible_python' from source: facts 30564 1726882883.45157: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882883.45240: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882883.45298: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882883.45427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.45495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.45498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.45513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.45524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.45559: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882883.45581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882883.45599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.45626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882883.45666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882883.45744: variable 'network_connections' from source: include params 30564 1726882883.45751: variable 'interface' from source: play vars 30564 1726882883.45808: variable 'interface' from source: play vars 30564 1726882883.45909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882883.46116: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882883.46145: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882883.46205: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882883.46239: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882883.46297: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882883.46318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882883.46351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882883.46382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882883.46443: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882883.46673: variable 'network_connections' from source: include params 30564 1726882883.46676: variable 'interface' from source: play vars 30564 1726882883.46741: variable 'interface' from source: play vars 30564 1726882883.46775: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882883.46867: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882883.47130: variable 'network_connections' from source: include params 30564 1726882883.47137: variable 'interface' from source: play vars 30564 1726882883.47215: variable 'interface' from source: play vars 30564 1726882883.47232: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882883.47326: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882883.47548: variable 'network_connections' from source: include params 30564 1726882883.47551: variable 'interface' from source: play vars 30564 1726882883.47609: variable 'interface' from source: play vars 30564 
1726882883.47655: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882883.47703: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882883.47709: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882883.47758: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882883.47903: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882883.48246: variable 'network_connections' from source: include params 30564 1726882883.48250: variable 'interface' from source: play vars 30564 1726882883.48298: variable 'interface' from source: play vars 30564 1726882883.48304: variable 'ansible_distribution' from source: facts 30564 1726882883.48307: variable '__network_rh_distros' from source: role '' defaults 30564 1726882883.48313: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.48328: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882883.48439: variable 'ansible_distribution' from source: facts 30564 1726882883.48442: variable '__network_rh_distros' from source: role '' defaults 30564 1726882883.48447: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.48458: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882883.48710: variable 'ansible_distribution' from source: facts 30564 1726882883.48716: variable '__network_rh_distros' from source: role '' defaults 30564 1726882883.48721: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.48724: variable 'network_provider' from source: set_fact 30564 1726882883.48755: variable 'omit' from source: magic vars 30564 1726882883.48772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882883.48792: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882883.48809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882883.48823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882883.48833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882883.48857: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882883.48860: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882883.48862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882883.48943: Set connection var ansible_timeout to 10 30564 1726882883.48958: Set connection var ansible_pipelining to False 30564 1726882883.48961: Set connection var ansible_shell_type to sh 30564 1726882883.48965: Set connection var ansible_shell_executable to /bin/sh 30564 1726882883.48967: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882883.48971: Set connection var ansible_connection to ssh 30564 1726882883.49067: variable 'ansible_shell_executable' from source: unknown 30564 1726882883.49086: variable 'ansible_connection' from source: unknown 30564 1726882883.49093: variable 'ansible_module_compression' from source: unknown 30564 1726882883.49101: variable 'ansible_shell_type' from source: unknown 30564 1726882883.49116: variable 'ansible_shell_executable' from source: unknown 30564 1726882883.49118: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882883.49121: variable 'ansible_pipelining' from source: unknown 30564 1726882883.49123: variable 'ansible_timeout' from source: unknown 30564 1726882883.49125: variable 'ansible_ssh_extra_args' from source: host 
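[Annotation] The `Set connection var` lines above show the connection parameters resolved for managed_node2 before the ssh connection plugin runs: timeout 10, pipelining off, `sh` shell, ZIP_DEFLATED module compression. Expressed as inventory host variables, these would look roughly like the following (an illustrative mapping, not the actual inventory.yml content, which only set `ansible_host` and `ansible_ssh_extra_args`):

```yaml
# Hypothetical host-vars sketch matching the connection vars logged above
managed_node2:
  ansible_connection: ssh
  ansible_timeout: 10
  ansible_pipelining: false
  ansible_shell_type: sh
  ansible_shell_executable: /bin/sh
```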
vars for 'managed_node2' 30564 1726882883.49217: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882883.49259: variable 'omit' from source: magic vars 30564 1726882883.49325: starting attempt loop 30564 1726882883.49343: running the handler 30564 1726882883.49363: variable 'ansible_facts' from source: unknown 30564 1726882883.50004: _low_level_execute_command(): starting 30564 1726882883.50008: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882883.50684: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882883.50695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.50706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.50719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.50763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.50771: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882883.50774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.50788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882883.50796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882883.50802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882883.50810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.50819: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.50829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.50838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.50842: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882883.50851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.50925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882883.50940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882883.50944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882883.51102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882883.52750: stdout chunk (state=3): >>>/root <<< 30564 1726882883.52854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882883.52902: stderr chunk (state=3): >>><<< 30564 1726882883.52907: stdout chunk (state=3): >>><<< 30564 1726882883.52926: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882883.52936: _low_level_execute_command(): starting 30564 1726882883.52942: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239 `" && echo ansible-tmp-1726882883.5292597-34146-248357732506239="` echo /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239 `" ) && sleep 0' 30564 1726882883.53392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.53398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.53425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.53432: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882883.53441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.53478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882883.53492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.53498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.53503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.53508: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882883.53523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.53622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882883.53625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882883.53638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882883.53823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882883.55623: stdout chunk (state=3): >>>ansible-tmp-1726882883.5292597-34146-248357732506239=/root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239 <<< 30564 1726882883.55729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882883.55799: stderr chunk (state=3): >>><<< 30564 1726882883.55810: stdout chunk (state=3): >>><<< 30564 1726882883.56073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882883.5292597-34146-248357732506239=/root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882883.56080: variable 'ansible_module_compression' from source: unknown 30564 1726882883.56083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882883.56085: variable 'ansible_facts' from source: unknown 30564 1726882883.56181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239/AnsiballZ_systemd.py 30564 1726882883.56340: Sending initial data 30564 1726882883.56344: Sent initial data (156 bytes) 30564 1726882883.57319: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882883.57334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.57348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.57367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.57407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.57420: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882883.57434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.57452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 30564 1726882883.57467: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882883.57486: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882883.57499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.57513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.57529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.57541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.57557: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882883.57575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.57654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882883.57682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882883.57700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882883.57835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882883.59598: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882883.59699: stderr chunk (state=3): >>>debug1: Using 
server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882883.59805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp6cnmsn9a /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239/AnsiballZ_systemd.py <<< 30564 1726882883.59901: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882883.62122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882883.62361: stderr chunk (state=3): >>><<< 30564 1726882883.62367: stdout chunk (state=3): >>><<< 30564 1726882883.62373: done transferring module to remote 30564 1726882883.62375: _low_level_execute_command(): starting 30564 1726882883.62378: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239/ /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239/AnsiballZ_systemd.py && sleep 0' 30564 1726882883.62930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882883.62944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.62959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.62984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.63025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.63037: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882883.63051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.63074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882883.63088: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882883.63099: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882883.63112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.63126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.63143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.63156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.63173: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882883.63189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.63262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882883.63284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882883.63297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882883.63546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882883.65261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882883.65305: stderr chunk (state=3): >>><<< 30564 1726882883.65308: stdout chunk (state=3): >>><<< 30564 1726882883.65322: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882883.65329: _low_level_execute_command(): starting 30564 1726882883.65332: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239/AnsiballZ_systemd.py && sleep 0' 30564 1726882883.65751: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.65759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.65775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.65803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.65810: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882883.65819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.65828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882883.65833: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882883.65838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882883.65847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.65859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882883.65862: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882883.65874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.65919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882883.65932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882883.65941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882883.66055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882883.90827: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": 
"202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9187328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2270954000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 30564 1726882883.90866: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill 
cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", 
"LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882883.92384: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882883.92438: stderr chunk (state=3): >>><<< 30564 1726882883.92442: stdout chunk (state=3): >>><<< 30564 1726882883.92457: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9187328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2270954000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882883.92577: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882883.92592: _low_level_execute_command(): starting 30564 1726882883.92595: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882883.5292597-34146-248357732506239/ > /dev/null 2>&1 && sleep 0' 30564 1726882883.93052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882883.93058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882883.93095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.93108: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882883.93159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882883.93186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882883.93284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882883.95090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882883.95134: stderr chunk (state=3): >>><<< 30564 1726882883.95138: stdout chunk (state=3): >>><<< 30564 1726882883.95150: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882883.95156: handler run complete 30564 1726882883.95199: attempt loop complete, returning result 30564 1726882883.95202: _execute() done 30564 1726882883.95204: dumping result to json 30564 1726882883.95217: done dumping result, returning 30564 1726882883.95224: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-0000000019cb] 30564 1726882883.95229: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cb 30564 1726882883.95457: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cb 30564 1726882883.95460: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882883.95526: no more pending results, returning what we have 30564 1726882883.95529: results queue empty 30564 1726882883.95530: checking for any_errors_fatal 30564 1726882883.95536: done checking for any_errors_fatal 30564 1726882883.95536: checking for max_fail_percentage 30564 1726882883.95538: done checking for max_fail_percentage 30564 1726882883.95538: checking to see if all hosts have failed and the running result is not ok 30564 1726882883.95539: done checking to see if all hosts have failed 30564 1726882883.95540: getting the remaining hosts for this loop 30564 1726882883.95542: done getting the remaining hosts for this loop 30564 1726882883.95545: getting the next task for host managed_node2 30564 1726882883.95552: done getting next task for host managed_node2 30564 1726882883.95556: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 
1726882883.95562: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882883.95585: getting variables 30564 1726882883.95587: in VariableManager get_vars() 30564 1726882883.95618: Calling all_inventory to load vars for managed_node2 30564 1726882883.95621: Calling groups_inventory to load vars for managed_node2 30564 1726882883.95623: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882883.95631: Calling all_plugins_play to load vars for managed_node2 30564 1726882883.95634: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882883.95636: Calling groups_plugins_play to load vars for managed_node2 30564 1726882883.96502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882883.97459: done with get_vars() 30564 1726882883.97481: done getting variables 30564 1726882883.97527: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:23 -0400 (0:00:00.581) 0:01:22.556 ****** 30564 1726882883.97558: entering _queue_task() for managed_node2/service 30564 1726882883.97798: worker is 1 (out of 1 available) 30564 1726882883.97810: exiting _queue_task() for managed_node2/service 30564 1726882883.97822: done queuing things up, now waiting for results queue to drain 30564 1726882883.97824: waiting for pending results... 
30564 1726882883.98019: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882883.98128: in run() - task 0e448fcc-3ce9-4216-acec-0000000019cc 30564 1726882883.98140: variable 'ansible_search_path' from source: unknown 30564 1726882883.98144: variable 'ansible_search_path' from source: unknown 30564 1726882883.98181: calling self._execute() 30564 1726882883.98257: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882883.98261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882883.98273: variable 'omit' from source: magic vars 30564 1726882883.98558: variable 'ansible_distribution_major_version' from source: facts 30564 1726882883.98573: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882883.98652: variable 'network_provider' from source: set_fact 30564 1726882883.98656: Evaluated conditional (network_provider == "nm"): True 30564 1726882883.98728: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882883.98791: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882883.98910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882884.01036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882884.01085: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882884.01113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882884.01144: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882884.01162: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882884.01219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882884.01240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882884.01261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882884.01291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882884.01302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882884.01333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882884.01348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882884.01375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882884.01398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882884.01408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882884.01435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882884.01451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882884.01476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882884.01500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882884.01511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882884.01605: variable 'network_connections' from source: include params 30564 1726882884.01614: variable 'interface' from source: play vars 30564 1726882884.01658: variable 'interface' from source: play vars 30564 1726882884.01721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882884.01831: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882884.01856: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882884.01882: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882884.01905: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882884.01936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882884.01951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882884.01972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882884.01989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882884.02030: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882884.02180: variable 'network_connections' from source: include params 30564 1726882884.02184: variable 'interface' from source: play vars 30564 1726882884.02226: variable 'interface' from source: play vars 30564 1726882884.02249: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882884.02253: when evaluation is False, skipping this task 30564 1726882884.02255: _execute() done 30564 1726882884.02257: dumping result to json 30564 1726882884.02260: done dumping result, returning 30564 1726882884.02267: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-0000000019cc] 30564 
1726882884.02280: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cc skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882884.02438: no more pending results, returning what we have 30564 1726882884.02442: results queue empty 30564 1726882884.02443: checking for any_errors_fatal 30564 1726882884.02469: done checking for any_errors_fatal 30564 1726882884.02472: checking for max_fail_percentage 30564 1726882884.02474: done checking for max_fail_percentage 30564 1726882884.02475: checking to see if all hosts have failed and the running result is not ok 30564 1726882884.02475: done checking to see if all hosts have failed 30564 1726882884.02476: getting the remaining hosts for this loop 30564 1726882884.02478: done getting the remaining hosts for this loop 30564 1726882884.02481: getting the next task for host managed_node2 30564 1726882884.02488: done getting next task for host managed_node2 30564 1726882884.02492: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882884.02497: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882884.02517: getting variables 30564 1726882884.02519: in VariableManager get_vars() 30564 1726882884.02552: Calling all_inventory to load vars for managed_node2 30564 1726882884.02554: Calling groups_inventory to load vars for managed_node2 30564 1726882884.02556: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882884.02566: Calling all_plugins_play to load vars for managed_node2 30564 1726882884.02571: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882884.02574: Calling groups_plugins_play to load vars for managed_node2 30564 1726882884.03249: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cc 30564 1726882884.03252: WORKER PROCESS EXITING 30564 1726882884.04202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882884.06036: done with get_vars() 30564 1726882884.06054: done getting variables 30564 1726882884.06101: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:24 -0400 (0:00:00.085) 0:01:22.642 ****** 30564 1726882884.06126: entering _queue_task() for managed_node2/service 30564 1726882884.06356: worker is 1 (out of 1 available) 30564 
1726882884.06374: exiting _queue_task() for managed_node2/service 30564 1726882884.06387: done queuing things up, now waiting for results queue to drain 30564 1726882884.06388: waiting for pending results... 30564 1726882884.06581: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882884.06666: in run() - task 0e448fcc-3ce9-4216-acec-0000000019cd 30564 1726882884.06679: variable 'ansible_search_path' from source: unknown 30564 1726882884.06682: variable 'ansible_search_path' from source: unknown 30564 1726882884.06710: calling self._execute() 30564 1726882884.06792: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.06796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.06805: variable 'omit' from source: magic vars 30564 1726882884.07092: variable 'ansible_distribution_major_version' from source: facts 30564 1726882884.07104: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882884.07190: variable 'network_provider' from source: set_fact 30564 1726882884.07195: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882884.07198: when evaluation is False, skipping this task 30564 1726882884.07201: _execute() done 30564 1726882884.07203: dumping result to json 30564 1726882884.07205: done dumping result, returning 30564 1726882884.07211: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-0000000019cd] 30564 1726882884.07216: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cd 30564 1726882884.07310: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cd 30564 1726882884.07314: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 
1726882884.07359: no more pending results, returning what we have 30564 1726882884.07365: results queue empty 30564 1726882884.07366: checking for any_errors_fatal 30564 1726882884.07375: done checking for any_errors_fatal 30564 1726882884.07376: checking for max_fail_percentage 30564 1726882884.07378: done checking for max_fail_percentage 30564 1726882884.07382: checking to see if all hosts have failed and the running result is not ok 30564 1726882884.07383: done checking to see if all hosts have failed 30564 1726882884.07384: getting the remaining hosts for this loop 30564 1726882884.07385: done getting the remaining hosts for this loop 30564 1726882884.07389: getting the next task for host managed_node2 30564 1726882884.07396: done getting next task for host managed_node2 30564 1726882884.07399: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882884.07404: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882884.07430: getting variables 30564 1726882884.07432: in VariableManager get_vars() 30564 1726882884.07465: Calling all_inventory to load vars for managed_node2 30564 1726882884.07470: Calling groups_inventory to load vars for managed_node2 30564 1726882884.07472: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882884.07480: Calling all_plugins_play to load vars for managed_node2 30564 1726882884.07482: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882884.07484: Calling groups_plugins_play to load vars for managed_node2 30564 1726882884.08308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882884.09887: done with get_vars() 30564 1726882884.09907: done getting variables 30564 1726882884.09949: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:24 -0400 (0:00:00.038) 0:01:22.681 ****** 30564 1726882884.09979: entering _queue_task() for managed_node2/copy 30564 1726882884.10187: worker is 1 (out of 1 available) 30564 1726882884.10199: exiting _queue_task() for managed_node2/copy 30564 1726882884.10211: done queuing things up, now waiting for results queue to drain 30564 1726882884.10212: waiting for pending results... 
30564 1726882884.10408: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882884.10499: in run() - task 0e448fcc-3ce9-4216-acec-0000000019ce 30564 1726882884.10511: variable 'ansible_search_path' from source: unknown 30564 1726882884.10515: variable 'ansible_search_path' from source: unknown 30564 1726882884.10545: calling self._execute() 30564 1726882884.10629: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.10633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.10642: variable 'omit' from source: magic vars 30564 1726882884.10930: variable 'ansible_distribution_major_version' from source: facts 30564 1726882884.10941: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882884.11030: variable 'network_provider' from source: set_fact 30564 1726882884.11034: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882884.11037: when evaluation is False, skipping this task 30564 1726882884.11040: _execute() done 30564 1726882884.11043: dumping result to json 30564 1726882884.11046: done dumping result, returning 30564 1726882884.11055: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-0000000019ce] 30564 1726882884.11061: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019ce 30564 1726882884.11155: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019ce 30564 1726882884.11157: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882884.11210: no more pending results, returning what we have 30564 1726882884.11214: results queue empty 30564 1726882884.11215: checking for 
any_errors_fatal 30564 1726882884.11220: done checking for any_errors_fatal 30564 1726882884.11220: checking for max_fail_percentage 30564 1726882884.11222: done checking for max_fail_percentage 30564 1726882884.11223: checking to see if all hosts have failed and the running result is not ok 30564 1726882884.11223: done checking to see if all hosts have failed 30564 1726882884.11224: getting the remaining hosts for this loop 30564 1726882884.11226: done getting the remaining hosts for this loop 30564 1726882884.11229: getting the next task for host managed_node2 30564 1726882884.11235: done getting next task for host managed_node2 30564 1726882884.11239: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882884.11244: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882884.11263: getting variables 30564 1726882884.11269: in VariableManager get_vars() 30564 1726882884.11306: Calling all_inventory to load vars for managed_node2 30564 1726882884.11309: Calling groups_inventory to load vars for managed_node2 30564 1726882884.11311: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882884.11318: Calling all_plugins_play to load vars for managed_node2 30564 1726882884.11320: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882884.11322: Calling groups_plugins_play to load vars for managed_node2 30564 1726882884.12380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882884.14707: done with get_vars() 30564 1726882884.14739: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:24 -0400 (0:00:00.048) 0:01:22.729 ****** 30564 1726882884.14809: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882884.15046: worker is 1 (out of 1 available) 30564 1726882884.15057: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882884.15074: done queuing things up, now waiting for results queue to drain 30564 1726882884.15075: waiting for pending results... 
30564 1726882884.15277: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882884.15370: in run() - task 0e448fcc-3ce9-4216-acec-0000000019cf 30564 1726882884.15391: variable 'ansible_search_path' from source: unknown 30564 1726882884.15396: variable 'ansible_search_path' from source: unknown 30564 1726882884.15426: calling self._execute() 30564 1726882884.15511: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.15516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.15525: variable 'omit' from source: magic vars 30564 1726882884.15823: variable 'ansible_distribution_major_version' from source: facts 30564 1726882884.15830: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882884.15836: variable 'omit' from source: magic vars 30564 1726882884.15881: variable 'omit' from source: magic vars 30564 1726882884.15996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882884.18315: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882884.18408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882884.18454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882884.18512: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882884.18540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882884.18642: variable 'network_provider' from source: set_fact 30564 1726882884.18802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882884.18839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882884.18873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882884.18928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882884.18948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882884.19040: variable 'omit' from source: magic vars 30564 1726882884.19172: variable 'omit' from source: magic vars 30564 1726882884.19290: variable 'network_connections' from source: include params 30564 1726882884.19307: variable 'interface' from source: play vars 30564 1726882884.19386: variable 'interface' from source: play vars 30564 1726882884.19541: variable 'omit' from source: magic vars 30564 1726882884.19559: variable '__lsr_ansible_managed' from source: task vars 30564 1726882884.19632: variable '__lsr_ansible_managed' from source: task vars 30564 1726882884.19837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882884.20089: Loaded config def from plugin (lookup/template) 30564 1726882884.20103: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882884.20141: File lookup term: get_ansible_managed.j2 30564 1726882884.20148: variable 
'ansible_search_path' from source: unknown 30564 1726882884.20157: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882884.20220: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882884.20247: variable 'ansible_search_path' from source: unknown 30564 1726882884.28608: variable 'ansible_managed' from source: unknown 30564 1726882884.28794: variable 'omit' from source: magic vars 30564 1726882884.28832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882884.28867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882884.28900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882884.28927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
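The `search_path` listing above follows a visible pattern: for each directory on the evaluation path, a `templates/` subdirectory is tried before the directory itself. A sketch of that candidate-list construction, inferred from the ordering in this log (illustrative only; `/role/network` is a shortened stand-in path):

```python
def candidate_template_paths(search_dirs, name):
    """For each search dir, try templates/<name> first, then <name> directly,
    mirroring the search_path ordering shown in the log."""
    paths = []
    for d in search_dirs:
        paths.append(f"{d}/templates/{name}")
        paths.append(f"{d}/{name}")
    return paths

paths = candidate_template_paths(
    ["/role/network", "/role/network/tasks"], "get_ansible_managed.j2")
```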
(found_in_cache=True, class_only=False) 30564 1726882884.28941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882884.28977: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882884.28986: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.28999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.29113: Set connection var ansible_timeout to 10 30564 1726882884.29130: Set connection var ansible_pipelining to False 30564 1726882884.29136: Set connection var ansible_shell_type to sh 30564 1726882884.29145: Set connection var ansible_shell_executable to /bin/sh 30564 1726882884.29157: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882884.29162: Set connection var ansible_connection to ssh 30564 1726882884.29195: variable 'ansible_shell_executable' from source: unknown 30564 1726882884.29203: variable 'ansible_connection' from source: unknown 30564 1726882884.29215: variable 'ansible_module_compression' from source: unknown 30564 1726882884.29224: variable 'ansible_shell_type' from source: unknown 30564 1726882884.29236: variable 'ansible_shell_executable' from source: unknown 30564 1726882884.29247: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.29255: variable 'ansible_pipelining' from source: unknown 30564 1726882884.29261: variable 'ansible_timeout' from source: unknown 30564 1726882884.29275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.29420: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882884.29449: variable 'omit' from 
source: magic vars 30564 1726882884.29463: starting attempt loop 30564 1726882884.29476: running the handler 30564 1726882884.29495: _low_level_execute_command(): starting 30564 1726882884.29504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882884.30318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882884.30340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.30356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.30380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.30426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.30448: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882884.30461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.30482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882884.30493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882884.30502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882884.30513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.30525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.30544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.30562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.30583: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882884.30599: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.30691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882884.30707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882884.30722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882884.30890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882884.32528: stdout chunk (state=3): >>>/root <<< 30564 1726882884.32631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882884.32694: stderr chunk (state=3): >>><<< 30564 1726882884.32696: stdout chunk (state=3): >>><<< 30564 1726882884.32781: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882884.32785: 
_low_level_execute_command(): starting 30564 1726882884.32788: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674 `" && echo ansible-tmp-1726882884.3270895-34180-108609130781674="` echo /root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674 `" ) && sleep 0' 30564 1726882884.33157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.33161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.33195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.33198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.33202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.33248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882884.33255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882884.33377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882884.35246: stdout chunk (state=3): 
>>>ansible-tmp-1726882884.3270895-34180-108609130781674=/root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674 <<< 30564 1726882884.35417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882884.35421: stderr chunk (state=3): >>><<< 30564 1726882884.35423: stdout chunk (state=3): >>><<< 30564 1726882884.35438: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882884.3270895-34180-108609130781674=/root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882884.35490: variable 'ansible_module_compression' from source: unknown 30564 1726882884.35534: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 
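The remote temp directory created above follows the pattern `ansible-tmp-<epoch>-<pid>-<random>` (here `1726882884.3270895-34180-108609130781674`). A sketch that builds a name of the same shape (an assumption about the components; this is not Ansible's internal function):

```python
import random
import time

def remote_tmpdir_name(pid: int) -> str:
    """Build a directory name shaped like the ansible-tmp-<epoch>-<pid>-<random>
    names visible in the log above (illustrative sketch)."""
    return "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2**48))

name = remote_tmpdir_name(34180)
```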
1726882884.35563: variable 'ansible_facts' from source: unknown 30564 1726882884.35679: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674/AnsiballZ_network_connections.py 30564 1726882884.35875: Sending initial data 30564 1726882884.35879: Sent initial data (168 bytes) 30564 1726882884.36642: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.36647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.36684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.36691: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882884.36700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.36709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882884.36716: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.36725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.36732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.36737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.36742: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882884.36750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.36803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 
1726882884.36829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882884.36834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882884.36929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882884.38701: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882884.38707: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882884.38798: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882884.38898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp3vmv0wpl /root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674/AnsiballZ_network_connections.py <<< 30564 1726882884.39009: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882884.40895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882884.40979: stderr chunk (state=3): >>><<< 30564 1726882884.40983: stdout chunk (state=3): >>><<< 30564 1726882884.41004: done transferring module to remote 30564 1726882884.41014: _low_level_execute_command(): starting 30564 1726882884.41019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674/ 
/root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674/AnsiballZ_network_connections.py && sleep 0' 30564 1726882884.41635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882884.41647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.41652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.41666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.41711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.41718: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882884.41727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.41740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882884.41749: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882884.41759: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882884.41762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.41775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.41786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.41794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.41800: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882884.41809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.41884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 30564 1726882884.41897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882884.41908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882884.42033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882884.43954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882884.43958: stdout chunk (state=3): >>><<< 30564 1726882884.43966: stderr chunk (state=3): >>><<< 30564 1726882884.43984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882884.43987: _low_level_execute_command(): starting 30564 1726882884.43991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674/AnsiballZ_network_connections.py && sleep 0' 30564 1726882884.44711: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882884.45195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.45212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.45241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.45260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.45268: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882884.45282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.45294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882884.45301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882884.45307: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882884.45314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.45323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.45334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.45341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.45347: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882884.45355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.45438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 30564 1726882884.45455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882884.45550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882884.46112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882884.68031: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882884.69491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882884.69531: stderr chunk (state=3): >>><<< 30564 1726882884.69534: stdout chunk (state=3): >>><<< 30564 1726882884.69554: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98 skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
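The module communicates its result back as the JSON document on stdout, which the controller parses to decide `changed`/`failed`. A minimal sketch using a copy of the payload above, trimmed for brevity:

```python
import json

# Trimmed copy of the module stdout captured in the log above.
stdout = (
    '{"changed": false, "warnings": [], '
    '"invocation": {"module_args": {"provider": "nm", '
    '"connections": [{"name": "statebr", "state": "up"}]}}}'
)

result = json.loads(stdout)
provider = result["invocation"]["module_args"]["provider"]
```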
status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882884.69597: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882884.69607: _low_level_execute_command(): starting 30564 1726882884.69612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882884.3270895-34180-108609130781674/ > /dev/null 2>&1 && sleep 0' 30564 1726882884.70285: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882884.70291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.70303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.70317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.70360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.70373: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882884.70384: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.70398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882884.70406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882884.70413: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882884.70421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882884.70430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882884.70448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882884.70455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882884.70462: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882884.70477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882884.70562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882884.70573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882884.70584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882884.70709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882884.72570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882884.72577: stderr chunk (state=3): >>><<< 30564 1726882884.72580: stdout chunk (state=3): >>><<< 30564 1726882884.72598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882884.72603: handler run complete 30564 1726882884.72635: attempt loop complete, returning result 30564 1726882884.72638: _execute() done 30564 1726882884.72641: dumping result to json 30564 1726882884.72643: done dumping result, returning 30564 1726882884.72653: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-0000000019cf] 30564 1726882884.72660: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cf 30564 1726882884.72786: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019cf 30564 1726882884.72789: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98 skipped because 
already active 30564 1726882884.72978: no more pending results, returning what we have 30564 1726882884.72982: results queue empty 30564 1726882884.72984: checking for any_errors_fatal 30564 1726882884.72991: done checking for any_errors_fatal 30564 1726882884.72991: checking for max_fail_percentage 30564 1726882884.72994: done checking for max_fail_percentage 30564 1726882884.72995: checking to see if all hosts have failed and the running result is not ok 30564 1726882884.72996: done checking to see if all hosts have failed 30564 1726882884.72996: getting the remaining hosts for this loop 30564 1726882884.72998: done getting the remaining hosts for this loop 30564 1726882884.73002: getting the next task for host managed_node2 30564 1726882884.73011: done getting next task for host managed_node2 30564 1726882884.73015: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882884.73022: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882884.73035: getting variables 30564 1726882884.73037: in VariableManager get_vars() 30564 1726882884.73099: Calling all_inventory to load vars for managed_node2 30564 1726882884.73102: Calling groups_inventory to load vars for managed_node2 30564 1726882884.73105: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882884.73116: Calling all_plugins_play to load vars for managed_node2 30564 1726882884.73120: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882884.73123: Calling groups_plugins_play to load vars for managed_node2 30564 1726882884.76474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882884.79161: done with get_vars() 30564 1726882884.79192: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:24 -0400 (0:00:00.647) 0:01:23.377 ****** 30564 1726882884.79606: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882884.79986: worker is 1 (out of 1 available) 30564 1726882884.80019: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882884.80032: done queuing things up, now waiting for results queue to drain 30564 1726882884.80034: waiting for pending results... 
30564 1726882884.80378: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882884.80532: in run() - task 0e448fcc-3ce9-4216-acec-0000000019d0 30564 1726882884.80551: variable 'ansible_search_path' from source: unknown 30564 1726882884.80554: variable 'ansible_search_path' from source: unknown 30564 1726882884.80600: calling self._execute() 30564 1726882884.80712: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.80718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.80729: variable 'omit' from source: magic vars 30564 1726882884.81163: variable 'ansible_distribution_major_version' from source: facts 30564 1726882884.81181: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882884.81443: variable 'network_state' from source: role '' defaults 30564 1726882884.81463: Evaluated conditional (network_state != {}): False 30564 1726882884.81468: when evaluation is False, skipping this task 30564 1726882884.81470: _execute() done 30564 1726882884.81474: dumping result to json 30564 1726882884.81476: done dumping result, returning 30564 1726882884.81483: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-0000000019d0] 30564 1726882884.81510: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d0 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882884.81681: no more pending results, returning what we have 30564 1726882884.81686: results queue empty 30564 1726882884.81687: checking for any_errors_fatal 30564 1726882884.81703: done checking for any_errors_fatal 30564 1726882884.81704: checking for max_fail_percentage 30564 1726882884.81705: done checking for max_fail_percentage 30564 1726882884.81706: 
checking to see if all hosts have failed and the running result is not ok 30564 1726882884.81707: done checking to see if all hosts have failed 30564 1726882884.81708: getting the remaining hosts for this loop 30564 1726882884.81710: done getting the remaining hosts for this loop 30564 1726882884.81713: getting the next task for host managed_node2 30564 1726882884.81722: done getting next task for host managed_node2 30564 1726882884.81726: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882884.81733: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882884.81749: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d0 30564 1726882884.81754: WORKER PROCESS EXITING 30564 1726882884.81779: getting variables 30564 1726882884.81781: in VariableManager get_vars() 30564 1726882884.81823: Calling all_inventory to load vars for managed_node2 30564 1726882884.81826: Calling groups_inventory to load vars for managed_node2 30564 1726882884.81828: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882884.81841: Calling all_plugins_play to load vars for managed_node2 30564 1726882884.81845: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882884.81848: Calling groups_plugins_play to load vars for managed_node2 30564 1726882884.85348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882884.87511: done with get_vars() 30564 1726882884.87534: done getting variables 30564 1726882884.87604: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:24 -0400 (0:00:00.080) 0:01:23.457 ****** 30564 1726882884.87653: entering _queue_task() for managed_node2/debug 30564 1726882884.88026: worker is 1 (out of 1 available) 30564 1726882884.88043: exiting _queue_task() for managed_node2/debug 30564 1726882884.88059: done queuing things up, now waiting for results queue to drain 30564 1726882884.88061: waiting for pending results... 
30564 1726882884.88385: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882884.88540: in run() - task 0e448fcc-3ce9-4216-acec-0000000019d1 30564 1726882884.88553: variable 'ansible_search_path' from source: unknown 30564 1726882884.88557: variable 'ansible_search_path' from source: unknown 30564 1726882884.88603: calling self._execute() 30564 1726882884.88715: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.88725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.88736: variable 'omit' from source: magic vars 30564 1726882884.89166: variable 'ansible_distribution_major_version' from source: facts 30564 1726882884.89182: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882884.89187: variable 'omit' from source: magic vars 30564 1726882884.89262: variable 'omit' from source: magic vars 30564 1726882884.89305: variable 'omit' from source: magic vars 30564 1726882884.89345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882884.89458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882884.89461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882884.89465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882884.89468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882884.90189: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882884.90193: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.90195: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882884.90418: Set connection var ansible_timeout to 10 30564 1726882884.90423: Set connection var ansible_pipelining to False 30564 1726882884.90426: Set connection var ansible_shell_type to sh 30564 1726882884.90433: Set connection var ansible_shell_executable to /bin/sh 30564 1726882884.90441: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882884.90444: Set connection var ansible_connection to ssh 30564 1726882884.90470: variable 'ansible_shell_executable' from source: unknown 30564 1726882884.90476: variable 'ansible_connection' from source: unknown 30564 1726882884.90480: variable 'ansible_module_compression' from source: unknown 30564 1726882884.90482: variable 'ansible_shell_type' from source: unknown 30564 1726882884.90485: variable 'ansible_shell_executable' from source: unknown 30564 1726882884.90487: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.90491: variable 'ansible_pipelining' from source: unknown 30564 1726882884.90609: variable 'ansible_timeout' from source: unknown 30564 1726882884.90619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.90881: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882884.90891: variable 'omit' from source: magic vars 30564 1726882884.90897: starting attempt loop 30564 1726882884.90900: running the handler 30564 1726882884.91038: variable '__network_connections_result' from source: set_fact 30564 1726882884.91106: handler run complete 30564 1726882884.91124: attempt loop complete, returning result 30564 1726882884.91127: _execute() done 30564 1726882884.91130: dumping result to json 30564 1726882884.91133: 
done dumping result, returning 30564 1726882884.91140: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000019d1] 30564 1726882884.91152: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d1 30564 1726882884.91251: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d1 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98 skipped because already active" ] } 30564 1726882884.91320: WORKER PROCESS EXITING 30564 1726882884.91342: no more pending results, returning what we have 30564 1726882884.91346: results queue empty 30564 1726882884.91348: checking for any_errors_fatal 30564 1726882884.91355: done checking for any_errors_fatal 30564 1726882884.91356: checking for max_fail_percentage 30564 1726882884.91357: done checking for max_fail_percentage 30564 1726882884.91358: checking to see if all hosts have failed and the running result is not ok 30564 1726882884.91359: done checking to see if all hosts have failed 30564 1726882884.91361: getting the remaining hosts for this loop 30564 1726882884.91362: done getting the remaining hosts for this loop 30564 1726882884.91371: getting the next task for host managed_node2 30564 1726882884.91380: done getting next task for host managed_node2 30564 1726882884.91384: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882884.91391: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882884.91406: getting variables 30564 1726882884.91408: in VariableManager get_vars() 30564 1726882884.91453: Calling all_inventory to load vars for managed_node2 30564 1726882884.91456: Calling groups_inventory to load vars for managed_node2 30564 1726882884.91459: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882884.91475: Calling all_plugins_play to load vars for managed_node2 30564 1726882884.91480: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882884.91483: Calling groups_plugins_play to load vars for managed_node2 30564 1726882884.93452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882884.96408: done with get_vars() 30564 1726882884.96474: done getting variables 30564 1726882884.96533: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:24 -0400 (0:00:00.089) 0:01:23.547 ****** 30564 1726882884.96583: entering _queue_task() for managed_node2/debug 30564 1726882884.96906: worker is 1 (out of 1 available) 30564 1726882884.96920: exiting _queue_task() for managed_node2/debug 30564 1726882884.96934: done queuing things up, now waiting for results queue to drain 30564 1726882884.96935: waiting for pending results... 30564 1726882884.97257: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882884.97441: in run() - task 0e448fcc-3ce9-4216-acec-0000000019d2 30564 1726882884.97458: variable 'ansible_search_path' from source: unknown 30564 1726882884.97462: variable 'ansible_search_path' from source: unknown 30564 1726882884.97506: calling self._execute() 30564 1726882884.97930: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.97934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.97946: variable 'omit' from source: magic vars 30564 1726882884.98345: variable 'ansible_distribution_major_version' from source: facts 30564 1726882884.98355: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882884.98361: variable 'omit' from source: magic vars 30564 1726882884.98433: variable 'omit' from source: magic vars 30564 1726882884.98514: variable 'omit' from source: magic vars 30564 1726882884.99224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882884.99258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882884.99282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882884.99300: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882884.99426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882884.99455: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882884.99459: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.99461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.99682: Set connection var ansible_timeout to 10 30564 1726882884.99686: Set connection var ansible_pipelining to False 30564 1726882884.99688: Set connection var ansible_shell_type to sh 30564 1726882884.99696: Set connection var ansible_shell_executable to /bin/sh 30564 1726882884.99703: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882884.99706: Set connection var ansible_connection to ssh 30564 1726882884.99730: variable 'ansible_shell_executable' from source: unknown 30564 1726882884.99734: variable 'ansible_connection' from source: unknown 30564 1726882884.99737: variable 'ansible_module_compression' from source: unknown 30564 1726882884.99740: variable 'ansible_shell_type' from source: unknown 30564 1726882884.99742: variable 'ansible_shell_executable' from source: unknown 30564 1726882884.99744: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882884.99763: variable 'ansible_pipelining' from source: unknown 30564 1726882884.99767: variable 'ansible_timeout' from source: unknown 30564 1726882884.99775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882884.99921: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882884.99931: variable 'omit' from source: magic vars 30564 1726882884.99937: starting attempt loop 30564 1726882884.99940: running the handler 30564 1726882884.99997: variable '__network_connections_result' from source: set_fact 30564 1726882885.00070: variable '__network_connections_result' from source: set_fact 30564 1726882885.00188: handler run complete 30564 1726882885.00219: attempt loop complete, returning result 30564 1726882885.00223: _execute() done 30564 1726882885.00225: dumping result to json 30564 1726882885.00228: done dumping result, returning 30564 1726882885.00237: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000019d2] 30564 1726882885.00243: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d2 30564 1726882885.00344: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d2 30564 1726882885.00348: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98 skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 891d4ab6-2d22-4634-8d3b-2e935067cc98 skipped because already active" ] } } 30564 1726882885.00448: no more pending results, returning what we have 30564 1726882885.00452: results queue empty 30564 
1726882885.00453: checking for any_errors_fatal 30564 1726882885.00462: done checking for any_errors_fatal 30564 1726882885.00465: checking for max_fail_percentage 30564 1726882885.00467: done checking for max_fail_percentage 30564 1726882885.00471: checking to see if all hosts have failed and the running result is not ok 30564 1726882885.00471: done checking to see if all hosts have failed 30564 1726882885.00473: getting the remaining hosts for this loop 30564 1726882885.00474: done getting the remaining hosts for this loop 30564 1726882885.00478: getting the next task for host managed_node2 30564 1726882885.00488: done getting next task for host managed_node2 30564 1726882885.00493: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882885.00499: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882885.00513: getting variables 30564 1726882885.00515: in VariableManager get_vars() 30564 1726882885.00554: Calling all_inventory to load vars for managed_node2 30564 1726882885.00557: Calling groups_inventory to load vars for managed_node2 30564 1726882885.00571: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.00583: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.00587: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.00590: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.02508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.04349: done with get_vars() 30564 1726882885.04377: done getting variables 30564 1726882885.04435: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:25 -0400 (0:00:00.078) 0:01:23.626 ****** 30564 1726882885.04478: entering _queue_task() for managed_node2/debug 30564 1726882885.04787: worker is 1 (out of 1 available) 30564 1726882885.04800: exiting _queue_task() for managed_node2/debug 30564 1726882885.04812: done queuing things up, now waiting for results queue to drain 30564 1726882885.04814: waiting for pending results... 
30564 1726882885.05126: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882885.05269: in run() - task 0e448fcc-3ce9-4216-acec-0000000019d3 30564 1726882885.05286: variable 'ansible_search_path' from source: unknown 30564 1726882885.05290: variable 'ansible_search_path' from source: unknown 30564 1726882885.05323: calling self._execute() 30564 1726882885.06075: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882885.06081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882885.06092: variable 'omit' from source: magic vars 30564 1726882885.06920: variable 'ansible_distribution_major_version' from source: facts 30564 1726882885.06934: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882885.07281: variable 'network_state' from source: role '' defaults 30564 1726882885.07291: Evaluated conditional (network_state != {}): False 30564 1726882885.07298: when evaluation is False, skipping this task 30564 1726882885.07379: _execute() done 30564 1726882885.07382: dumping result to json 30564 1726882885.07385: done dumping result, returning 30564 1726882885.07392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-0000000019d3] 30564 1726882885.07399: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d3 30564 1726882885.07501: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d3 30564 1726882885.07505: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882885.07556: no more pending results, returning what we have 30564 1726882885.07559: results queue empty 30564 1726882885.07560: checking for any_errors_fatal 30564 1726882885.07576: done checking for any_errors_fatal 30564 1726882885.07577: checking for 
max_fail_percentage 30564 1726882885.07579: done checking for max_fail_percentage 30564 1726882885.07580: checking to see if all hosts have failed and the running result is not ok 30564 1726882885.07581: done checking to see if all hosts have failed 30564 1726882885.07582: getting the remaining hosts for this loop 30564 1726882885.07584: done getting the remaining hosts for this loop 30564 1726882885.07587: getting the next task for host managed_node2 30564 1726882885.07594: done getting next task for host managed_node2 30564 1726882885.07599: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882885.07604: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882885.07633: getting variables 30564 1726882885.07635: in VariableManager get_vars() 30564 1726882885.07686: Calling all_inventory to load vars for managed_node2 30564 1726882885.07689: Calling groups_inventory to load vars for managed_node2 30564 1726882885.07692: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.07705: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.07709: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.07712: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.10311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.12098: done with get_vars() 30564 1726882885.12126: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:25 -0400 (0:00:00.077) 0:01:23.703 ****** 30564 1726882885.12232: entering _queue_task() for managed_node2/ping 30564 1726882885.12646: worker is 1 (out of 1 available) 30564 1726882885.12659: exiting _queue_task() for managed_node2/ping 30564 1726882885.12785: done queuing things up, now waiting for results queue to drain 30564 1726882885.12787: waiting for pending results... 
30564 1726882885.13768: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882885.14833: in run() - task 0e448fcc-3ce9-4216-acec-0000000019d4 30564 1726882885.15020: variable 'ansible_search_path' from source: unknown 30564 1726882885.15028: variable 'ansible_search_path' from source: unknown 30564 1726882885.15067: calling self._execute() 30564 1726882885.15172: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882885.15213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882885.15233: variable 'omit' from source: magic vars 30564 1726882885.16114: variable 'ansible_distribution_major_version' from source: facts 30564 1726882885.16140: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882885.16153: variable 'omit' from source: magic vars 30564 1726882885.16251: variable 'omit' from source: magic vars 30564 1726882885.16292: variable 'omit' from source: magic vars 30564 1726882885.16339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882885.16386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882885.16418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882885.16442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882885.16465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882885.16501: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882885.16511: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882885.16519: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882885.16625: Set connection var ansible_timeout to 10 30564 1726882885.16637: Set connection var ansible_pipelining to False 30564 1726882885.16645: Set connection var ansible_shell_type to sh 30564 1726882885.16655: Set connection var ansible_shell_executable to /bin/sh 30564 1726882885.16671: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882885.16684: Set connection var ansible_connection to ssh 30564 1726882885.16714: variable 'ansible_shell_executable' from source: unknown 30564 1726882885.16722: variable 'ansible_connection' from source: unknown 30564 1726882885.16729: variable 'ansible_module_compression' from source: unknown 30564 1726882885.16735: variable 'ansible_shell_type' from source: unknown 30564 1726882885.16741: variable 'ansible_shell_executable' from source: unknown 30564 1726882885.16747: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882885.16754: variable 'ansible_pipelining' from source: unknown 30564 1726882885.16761: variable 'ansible_timeout' from source: unknown 30564 1726882885.16771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882885.18457: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882885.18468: variable 'omit' from source: magic vars 30564 1726882885.18542: starting attempt loop 30564 1726882885.18546: running the handler 30564 1726882885.18559: _low_level_execute_command(): starting 30564 1726882885.18568: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882885.19522: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882885.19539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882885.19550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882885.19566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.19629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882885.19647: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882885.19657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.19678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882885.19708: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882885.19715: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882885.19723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.19733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882885.19750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.19782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882885.19789: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882885.19808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.19915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882885.19987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882885.19997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882885.20182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882885.21826: stdout chunk (state=3): >>>/root <<< 30564 1726882885.21977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882885.22005: stderr chunk (state=3): >>><<< 30564 1726882885.22008: stdout chunk (state=3): >>><<< 30564 1726882885.22072: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882885.22076: _low_level_execute_command(): starting 30564 1726882885.22079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646 `" && echo ansible-tmp-1726882885.220324-34241-133997820689646="` echo /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646 `" ) && sleep 0' 30564 1726882885.22737: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 30564 1726882885.22751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.22771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882885.22790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.22830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882885.22841: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882885.22854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.22875: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882885.22889: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882885.22900: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882885.22911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.22923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882885.22943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.22968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882885.22982: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882885.22995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.23073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882885.23096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882885.23113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30564 1726882885.23251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882885.25136: stdout chunk (state=3): >>>ansible-tmp-1726882885.220324-34241-133997820689646=/root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646 <<< 30564 1726882885.25278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882885.25321: stderr chunk (state=3): >>><<< 30564 1726882885.25324: stdout chunk (state=3): >>><<< 30564 1726882885.25470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882885.220324-34241-133997820689646=/root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882885.25473: variable 'ansible_module_compression' from source: unknown 30564 1726882885.25476: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882885.25479: variable 'ansible_facts' from source: unknown 30564 1726882885.25636: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646/AnsiballZ_ping.py 30564 1726882885.25695: Sending initial data 30564 1726882885.25705: Sent initial data (152 bytes) 30564 1726882885.26984: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.26987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.27027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.27030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882885.27032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.27034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.27111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882885.27114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882885.27224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882885.28957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882885.29046: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882885.29153: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpjogj63x7 /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646/AnsiballZ_ping.py <<< 30564 1726882885.29246: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882885.30378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882885.30471: stderr chunk (state=3): >>><<< 30564 1726882885.30475: stdout chunk (state=3): >>><<< 30564 1726882885.30481: done transferring module to remote 30564 1726882885.30490: _low_level_execute_command(): starting 30564 1726882885.30496: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646/ /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646/AnsiballZ_ping.py && sleep 0' 30564 1726882885.30912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.30915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.30951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.30955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882885.30958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.31012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882885.31017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882885.31113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882885.32847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882885.32936: stderr chunk (state=3): >>><<< 30564 1726882885.32939: stdout chunk (state=3): >>><<< 30564 1726882885.32959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882885.32962: _low_level_execute_command(): starting 30564 1726882885.32973: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646/AnsiballZ_ping.py && sleep 0' 30564 1726882885.33623: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.33629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.33668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.33679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.33686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882885.33699: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.33712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882885.33717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.33792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882885.33799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882885.33819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882885.33949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882885.46787: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882885.47836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882885.47848: stderr chunk (state=3): >>><<< 30564 1726882885.47851: stdout chunk (state=3): >>><<< 30564 1726882885.47872: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882885.47907: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882885.47912: _low_level_execute_command(): starting 30564 1726882885.47917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882885.220324-34241-133997820689646/ > /dev/null 2>&1 && sleep 0' 30564 1726882885.48450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882885.48454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.48491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882885.48495: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882885.48503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882885.48550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882885.48553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882885.48558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882885.48656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882885.50465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882885.50526: stderr chunk (state=3): >>><<< 30564 1726882885.50535: stdout chunk (state=3): >>><<< 30564 1726882885.50556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882885.50560: handler run complete 30564 1726882885.50575: attempt loop complete, returning result 30564 1726882885.50578: _execute() done 30564 1726882885.50580: dumping result to json 30564 1726882885.50583: done dumping result, returning 30564 1726882885.50594: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-0000000019d4] 30564 1726882885.50600: sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d4 30564 1726882885.50733: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000019d4 30564 1726882885.50736: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882885.50812: no more pending results, returning what we have 30564 1726882885.50815: results queue empty 30564 1726882885.50816: checking for any_errors_fatal 30564 1726882885.50845: done checking for any_errors_fatal 30564 1726882885.50847: checking for max_fail_percentage 30564 1726882885.50853: done checking for max_fail_percentage 30564 1726882885.50854: checking to see if all hosts have failed and the running result is not ok 30564 1726882885.50855: done checking to see if all hosts have failed 30564 1726882885.50856: getting the remaining hosts for this loop 30564 1726882885.50858: done getting the remaining hosts for this loop 30564 1726882885.50862: getting the next task for host managed_node2 30564 1726882885.50880: done getting next task for host managed_node2 30564 1726882885.50883: ^ task is: TASK: meta 
(role_complete) 30564 1726882885.50888: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882885.50906: getting variables 30564 1726882885.50908: in VariableManager get_vars() 30564 1726882885.50965: Calling all_inventory to load vars for managed_node2 30564 1726882885.50968: Calling groups_inventory to load vars for managed_node2 30564 1726882885.50971: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.50980: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.50983: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.50985: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.52216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.53399: done with get_vars() 30564 1726882885.53415: done getting variables 30564 1726882885.53480: done queuing things up, now waiting for results queue to drain 30564 1726882885.53481: results queue empty 30564 1726882885.53482: checking for any_errors_fatal 30564 1726882885.53484: done checking for any_errors_fatal 30564 1726882885.53484: checking for max_fail_percentage 30564 1726882885.53485: done checking for max_fail_percentage 30564 1726882885.53485: checking to see if all hosts have failed and the running result is not ok 30564 1726882885.53486: done checking to see if all hosts have failed 30564 1726882885.53486: getting the remaining hosts for this loop 30564 1726882885.53487: done getting the remaining hosts for this loop 30564 1726882885.53489: getting the next task for host managed_node2 30564 1726882885.53493: done getting next task for host managed_node2 30564 1726882885.53494: ^ task is: TASK: Include network role 30564 1726882885.53496: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882885.53498: getting variables 30564 1726882885.53499: in VariableManager get_vars() 30564 1726882885.53506: Calling all_inventory to load vars for managed_node2 30564 1726882885.53507: Calling groups_inventory to load vars for managed_node2 30564 1726882885.53509: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.53512: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.53514: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.53515: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.54324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.55819: done with get_vars() 30564 1726882885.55838: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_profile.yml:3 Friday 20 September 2024 21:41:25 -0400 (0:00:00.436) 0:01:24.140 ****** 30564 1726882885.55913: entering _queue_task() for managed_node2/include_role 30564 1726882885.56233: worker is 1 (out of 1 available) 30564 1726882885.56245: exiting _queue_task() for managed_node2/include_role 30564 
1726882885.56257: done queuing things up, now waiting for results queue to drain 30564 1726882885.56258: waiting for pending results... 30564 1726882885.56550: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882885.56679: in run() - task 0e448fcc-3ce9-4216-acec-0000000017d9 30564 1726882885.56691: variable 'ansible_search_path' from source: unknown 30564 1726882885.56695: variable 'ansible_search_path' from source: unknown 30564 1726882885.56734: calling self._execute() 30564 1726882885.56870: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882885.56873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882885.56877: variable 'omit' from source: magic vars 30564 1726882885.57253: variable 'ansible_distribution_major_version' from source: facts 30564 1726882885.57271: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882885.57274: _execute() done 30564 1726882885.57277: dumping result to json 30564 1726882885.57281: done dumping result, returning 30564 1726882885.57288: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-0000000017d9] 30564 1726882885.57294: sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d9 30564 1726882885.57414: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000017d9 30564 1726882885.57417: WORKER PROCESS EXITING 30564 1726882885.57445: no more pending results, returning what we have 30564 1726882885.57451: in VariableManager get_vars() 30564 1726882885.57498: Calling all_inventory to load vars for managed_node2 30564 1726882885.57501: Calling groups_inventory to load vars for managed_node2 30564 1726882885.57505: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.57518: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.57522: Calling groups_plugins_inventory to load vars for managed_node2 30564 
1726882885.57524: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.59178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.60215: done with get_vars() 30564 1726882885.60238: variable 'ansible_search_path' from source: unknown 30564 1726882885.60240: variable 'ansible_search_path' from source: unknown 30564 1726882885.60392: variable 'omit' from source: magic vars 30564 1726882885.60431: variable 'omit' from source: magic vars 30564 1726882885.60447: variable 'omit' from source: magic vars 30564 1726882885.60450: we have included files to process 30564 1726882885.60451: generating all_blocks data 30564 1726882885.60453: done generating all_blocks data 30564 1726882885.60457: processing included file: fedora.linux_system_roles.network 30564 1726882885.60482: in VariableManager get_vars() 30564 1726882885.60494: done with get_vars() 30564 1726882885.60518: in VariableManager get_vars() 30564 1726882885.60531: done with get_vars() 30564 1726882885.60561: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882885.60690: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882885.60773: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882885.61242: in VariableManager get_vars() 30564 1726882885.61262: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882885.63172: iterating over new_blocks loaded from include file 30564 1726882885.63175: in VariableManager get_vars() 30564 1726882885.63190: done with get_vars() 30564 1726882885.63192: filtering new block on tags 30564 1726882885.75093: done filtering new block on tags 30564 1726882885.75098: in VariableManager get_vars() 30564 1726882885.75117: done with 
get_vars() 30564 1726882885.75119: filtering new block on tags 30564 1726882885.75145: done filtering new block on tags 30564 1726882885.75147: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882885.75152: extending task lists for all hosts with included blocks 30564 1726882885.75280: done extending task lists 30564 1726882885.75282: done processing included files 30564 1726882885.75283: results queue empty 30564 1726882885.75284: checking for any_errors_fatal 30564 1726882885.75285: done checking for any_errors_fatal 30564 1726882885.75286: checking for max_fail_percentage 30564 1726882885.75287: done checking for max_fail_percentage 30564 1726882885.75288: checking to see if all hosts have failed and the running result is not ok 30564 1726882885.75289: done checking to see if all hosts have failed 30564 1726882885.75290: getting the remaining hosts for this loop 30564 1726882885.75291: done getting the remaining hosts for this loop 30564 1726882885.75294: getting the next task for host managed_node2 30564 1726882885.75298: done getting next task for host managed_node2 30564 1726882885.75301: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882885.75304: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882885.75316: getting variables 30564 1726882885.75317: in VariableManager get_vars() 30564 1726882885.75332: Calling all_inventory to load vars for managed_node2 30564 1726882885.75335: Calling groups_inventory to load vars for managed_node2 30564 1726882885.75337: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.75342: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.75344: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.75347: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.76613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.78896: done with get_vars() 30564 1726882885.78922: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:25 -0400 (0:00:00.230) 0:01:24.371 ****** 30564 1726882885.79000: entering _queue_task() for managed_node2/include_tasks 30564 1726882885.79356: worker is 1 (out of 1 available) 30564 1726882885.79371: exiting _queue_task() for managed_node2/include_tasks 30564 1726882885.79383: done queuing things up, now waiting for results queue to drain 30564 1726882885.79384: 
waiting for pending results... 30564 1726882885.79690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882885.79847: in run() - task 0e448fcc-3ce9-4216-acec-000000001b3b 30564 1726882885.79873: variable 'ansible_search_path' from source: unknown 30564 1726882885.79883: variable 'ansible_search_path' from source: unknown 30564 1726882885.79930: calling self._execute() 30564 1726882885.80036: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882885.80053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882885.80074: variable 'omit' from source: magic vars 30564 1726882885.80656: variable 'ansible_distribution_major_version' from source: facts 30564 1726882885.80681: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882885.80714: _execute() done 30564 1726882885.80723: dumping result to json 30564 1726882885.80731: done dumping result, returning 30564 1726882885.80741: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-000000001b3b] 30564 1726882885.80752: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3b 30564 1726882885.80916: no more pending results, returning what we have 30564 1726882885.80922: in VariableManager get_vars() 30564 1726882885.80975: Calling all_inventory to load vars for managed_node2 30564 1726882885.80978: Calling groups_inventory to load vars for managed_node2 30564 1726882885.80981: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.80995: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.80999: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.81002: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.82027: done sending task result for task 
0e448fcc-3ce9-4216-acec-000000001b3b 30564 1726882885.82030: WORKER PROCESS EXITING 30564 1726882885.82555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.84282: done with get_vars() 30564 1726882885.84303: variable 'ansible_search_path' from source: unknown 30564 1726882885.84304: variable 'ansible_search_path' from source: unknown 30564 1726882885.84352: we have included files to process 30564 1726882885.84353: generating all_blocks data 30564 1726882885.84356: done generating all_blocks data 30564 1726882885.84359: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882885.84361: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882885.84457: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882885.85480: done processing included file 30564 1726882885.85483: iterating over new_blocks loaded from include file 30564 1726882885.85484: in VariableManager get_vars() 30564 1726882885.85509: done with get_vars() 30564 1726882885.85511: filtering new block on tags 30564 1726882885.85550: done filtering new block on tags 30564 1726882885.85554: in VariableManager get_vars() 30564 1726882885.85582: done with get_vars() 30564 1726882885.85584: filtering new block on tags 30564 1726882885.85631: done filtering new block on tags 30564 1726882885.85634: in VariableManager get_vars() 30564 1726882885.85666: done with get_vars() 30564 1726882885.85669: filtering new block on tags 30564 1726882885.85696: done filtering new block on tags 30564 1726882885.85698: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 
1726882885.85701: extending task lists for all hosts with included blocks 30564 1726882885.86745: done extending task lists 30564 1726882885.86746: done processing included files 30564 1726882885.86747: results queue empty 30564 1726882885.86747: checking for any_errors_fatal 30564 1726882885.86750: done checking for any_errors_fatal 30564 1726882885.86750: checking for max_fail_percentage 30564 1726882885.86751: done checking for max_fail_percentage 30564 1726882885.86752: checking to see if all hosts have failed and the running result is not ok 30564 1726882885.86752: done checking to see if all hosts have failed 30564 1726882885.86753: getting the remaining hosts for this loop 30564 1726882885.86754: done getting the remaining hosts for this loop 30564 1726882885.86756: getting the next task for host managed_node2 30564 1726882885.86759: done getting next task for host managed_node2 30564 1726882885.86761: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882885.86765: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882885.86773: getting variables 30564 1726882885.86774: in VariableManager get_vars() 30564 1726882885.86783: Calling all_inventory to load vars for managed_node2 30564 1726882885.86784: Calling groups_inventory to load vars for managed_node2 30564 1726882885.86786: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.86789: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.86790: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.86792: Calling groups_plugins_play to load vars for managed_node2 30564 1726882885.88356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882885.90832: done with get_vars() 30564 1726882885.90881: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:25 -0400 (0:00:00.119) 0:01:24.491 ****** 30564 1726882885.90988: entering _queue_task() for managed_node2/setup 30564 1726882885.91452: worker is 1 (out of 1 available) 30564 1726882885.91473: exiting _queue_task() for managed_node2/setup 30564 1726882885.91489: done queuing things up, now waiting for results queue to drain 30564 1726882885.91491: waiting for pending results... 
30564 1726882885.91936: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882885.92126: in run() - task 0e448fcc-3ce9-4216-acec-000000001b92 30564 1726882885.92148: variable 'ansible_search_path' from source: unknown 30564 1726882885.92152: variable 'ansible_search_path' from source: unknown 30564 1726882885.92182: calling self._execute() 30564 1726882885.92333: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882885.92346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882885.92366: variable 'omit' from source: magic vars 30564 1726882885.92900: variable 'ansible_distribution_major_version' from source: facts 30564 1726882885.92926: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882885.93267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882885.96116: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882885.96204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882885.96244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882885.96283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882885.96319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882885.96415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882885.96444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882885.96479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882885.96529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882885.96543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882885.96602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882885.96633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882885.96658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882885.96716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882885.96784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882885.97021: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882885.97035: variable 'ansible_facts' from source: unknown 30564 1726882885.98117: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882885.98121: when evaluation is False, skipping this task 30564 1726882885.98123: _execute() done 30564 1726882885.98126: dumping result to json 30564 1726882885.98129: done dumping result, returning 30564 1726882885.98131: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000001b92] 30564 1726882885.98177: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b92 30564 1726882885.98272: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b92 30564 1726882885.98275: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882885.98326: no more pending results, returning what we have 30564 1726882885.98331: results queue empty 30564 1726882885.98332: checking for any_errors_fatal 30564 1726882885.98335: done checking for any_errors_fatal 30564 1726882885.98335: checking for max_fail_percentage 30564 1726882885.98337: done checking for max_fail_percentage 30564 1726882885.98338: checking to see if all hosts have failed and the running result is not ok 30564 1726882885.98339: done checking to see if all hosts have failed 30564 1726882885.98340: getting the remaining hosts for this loop 30564 1726882885.98342: done getting the remaining hosts for this loop 30564 1726882885.98347: getting the next task for host managed_node2 30564 1726882885.98373: done getting next task for host managed_node2 30564 1726882885.98378: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882885.98386: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882885.98416: getting variables 30564 1726882885.98419: in VariableManager get_vars() 30564 1726882885.98482: Calling all_inventory to load vars for managed_node2 30564 1726882885.98486: Calling groups_inventory to load vars for managed_node2 30564 1726882885.98489: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882885.98500: Calling all_plugins_play to load vars for managed_node2 30564 1726882885.98507: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882885.98528: Calling groups_plugins_play to load vars for managed_node2 30564 1726882886.00554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882886.02590: done with get_vars() 30564 1726882886.02624: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:26 -0400 (0:00:00.117) 0:01:24.608 ****** 30564 1726882886.02758: entering _queue_task() for managed_node2/stat 30564 1726882886.03134: worker is 1 (out of 1 available) 30564 1726882886.03153: exiting _queue_task() for managed_node2/stat 30564 1726882886.03178: done queuing things up, now waiting for results queue to drain 30564 1726882886.03180: waiting for pending results... 
30564 1726882886.03617: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882886.03758: in run() - task 0e448fcc-3ce9-4216-acec-000000001b94 30564 1726882886.03776: variable 'ansible_search_path' from source: unknown 30564 1726882886.03780: variable 'ansible_search_path' from source: unknown 30564 1726882886.03822: calling self._execute() 30564 1726882886.03963: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882886.03971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882886.03985: variable 'omit' from source: magic vars 30564 1726882886.04433: variable 'ansible_distribution_major_version' from source: facts 30564 1726882886.04445: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882886.04640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882886.04948: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882886.05011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882886.05049: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882886.05087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882886.05210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882886.05241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882886.05301: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882886.05384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882886.05485: variable '__network_is_ostree' from source: set_fact 30564 1726882886.05496: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882886.05499: when evaluation is False, skipping this task 30564 1726882886.05502: _execute() done 30564 1726882886.05504: dumping result to json 30564 1726882886.05507: done dumping result, returning 30564 1726882886.05514: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000001b94] 30564 1726882886.05521: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b94 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882886.05899: no more pending results, returning what we have 30564 1726882886.05908: results queue empty 30564 1726882886.05909: checking for any_errors_fatal 30564 1726882886.05920: done checking for any_errors_fatal 30564 1726882886.05921: checking for max_fail_percentage 30564 1726882886.05924: done checking for max_fail_percentage 30564 1726882886.05925: checking to see if all hosts have failed and the running result is not ok 30564 1726882886.05926: done checking to see if all hosts have failed 30564 1726882886.05927: getting the remaining hosts for this loop 30564 1726882886.05931: done getting the remaining hosts for this loop 30564 1726882886.05937: getting the next task for host managed_node2 30564 1726882886.05951: done getting next task for host managed_node2 30564 
1726882886.05958: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882886.05970: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882886.06014: getting variables 30564 1726882886.06018: in VariableManager get_vars() 30564 1726882886.06079: Calling all_inventory to load vars for managed_node2 30564 1726882886.06081: Calling groups_inventory to load vars for managed_node2 30564 1726882886.06084: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882886.06096: Calling all_plugins_play to load vars for managed_node2 30564 1726882886.06099: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882886.06103: Calling groups_plugins_play to load vars for managed_node2 30564 1726882886.06662: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b94 30564 1726882886.06667: WORKER PROCESS EXITING 30564 1726882886.08960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882886.11657: done with get_vars() 30564 1726882886.11803: done getting variables 30564 1726882886.11862: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:26 -0400 (0:00:00.092) 0:01:24.701 ****** 30564 1726882886.11994: entering _queue_task() for managed_node2/set_fact 30564 1726882886.12335: worker is 1 (out of 1 available) 30564 1726882886.12347: exiting _queue_task() for managed_node2/set_fact 30564 1726882886.12359: done queuing things up, now waiting for results queue to drain 30564 1726882886.12360: waiting for pending results... 
30564 1726882886.12703: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882886.12879: in run() - task 0e448fcc-3ce9-4216-acec-000000001b95 30564 1726882886.12893: variable 'ansible_search_path' from source: unknown 30564 1726882886.12897: variable 'ansible_search_path' from source: unknown 30564 1726882886.12932: calling self._execute() 30564 1726882886.13045: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882886.13052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882886.13068: variable 'omit' from source: magic vars 30564 1726882886.13503: variable 'ansible_distribution_major_version' from source: facts 30564 1726882886.13522: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882886.13718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882886.14026: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882886.14084: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882886.14118: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882886.14156: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882886.14305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882886.14330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882886.14360: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882886.14400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882886.14509: variable '__network_is_ostree' from source: set_fact 30564 1726882886.14520: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882886.14523: when evaluation is False, skipping this task 30564 1726882886.14525: _execute() done 30564 1726882886.14528: dumping result to json 30564 1726882886.14530: done dumping result, returning 30564 1726882886.14536: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000001b95] 30564 1726882886.14541: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b95 30564 1726882886.14640: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b95 30564 1726882886.14643: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882886.14715: no more pending results, returning what we have 30564 1726882886.14719: results queue empty 30564 1726882886.14720: checking for any_errors_fatal 30564 1726882886.14728: done checking for any_errors_fatal 30564 1726882886.14728: checking for max_fail_percentage 30564 1726882886.14730: done checking for max_fail_percentage 30564 1726882886.14731: checking to see if all hosts have failed and the running result is not ok 30564 1726882886.14732: done checking to see if all hosts have failed 30564 1726882886.14733: getting the remaining hosts for this loop 30564 1726882886.14735: done getting the remaining hosts for this loop 
30564 1726882886.14739: getting the next task for host managed_node2 30564 1726882886.14751: done getting next task for host managed_node2 30564 1726882886.14755: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882886.14762: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882886.14794: getting variables 30564 1726882886.14796: in VariableManager get_vars() 30564 1726882886.14836: Calling all_inventory to load vars for managed_node2 30564 1726882886.14838: Calling groups_inventory to load vars for managed_node2 30564 1726882886.14840: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882886.14850: Calling all_plugins_play to load vars for managed_node2 30564 1726882886.14853: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882886.14856: Calling groups_plugins_play to load vars for managed_node2 30564 1726882886.17005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882886.18735: done with get_vars() 30564 1726882886.18752: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:26 -0400 (0:00:00.068) 0:01:24.769 ****** 30564 1726882886.18826: entering _queue_task() for managed_node2/service_facts 30564 1726882886.19053: worker is 1 (out of 1 available) 30564 1726882886.19067: exiting _queue_task() for managed_node2/service_facts 30564 1726882886.19082: done queuing things up, now waiting for results queue to drain 30564 1726882886.19084: waiting for pending results... 
30564 1726882886.19275: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882886.19374: in run() - task 0e448fcc-3ce9-4216-acec-000000001b97 30564 1726882886.19384: variable 'ansible_search_path' from source: unknown 30564 1726882886.19388: variable 'ansible_search_path' from source: unknown 30564 1726882886.19416: calling self._execute() 30564 1726882886.19502: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882886.19506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882886.19538: variable 'omit' from source: magic vars 30564 1726882886.21091: variable 'ansible_distribution_major_version' from source: facts 30564 1726882886.21095: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882886.21097: variable 'omit' from source: magic vars 30564 1726882886.21100: variable 'omit' from source: magic vars 30564 1726882886.21106: variable 'omit' from source: magic vars 30564 1726882886.21108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882886.21111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882886.21113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882886.21116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882886.21118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882886.21185: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882886.21189: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882886.21193: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882886.21350: Set connection var ansible_timeout to 10 30564 1726882886.21354: Set connection var ansible_pipelining to False 30564 1726882886.21356: Set connection var ansible_shell_type to sh 30564 1726882886.21359: Set connection var ansible_shell_executable to /bin/sh 30564 1726882886.21360: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882886.21362: Set connection var ansible_connection to ssh 30564 1726882886.21366: variable 'ansible_shell_executable' from source: unknown 30564 1726882886.21371: variable 'ansible_connection' from source: unknown 30564 1726882886.21374: variable 'ansible_module_compression' from source: unknown 30564 1726882886.21376: variable 'ansible_shell_type' from source: unknown 30564 1726882886.21378: variable 'ansible_shell_executable' from source: unknown 30564 1726882886.21379: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882886.21381: variable 'ansible_pipelining' from source: unknown 30564 1726882886.21383: variable 'ansible_timeout' from source: unknown 30564 1726882886.21386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882886.21514: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882886.21522: variable 'omit' from source: magic vars 30564 1726882886.21527: starting attempt loop 30564 1726882886.21529: running the handler 30564 1726882886.21543: _low_level_execute_command(): starting 30564 1726882886.21549: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882886.22239: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882886.22251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882886.22263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882886.22281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882886.22321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882886.22329: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882886.22339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.22351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882886.22360: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882886.22371: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882886.22377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882886.22387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882886.22399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882886.22412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882886.22495: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 30564 1726882886.22503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882886.22520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882886.22652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882886.24315: stdout chunk (state=3): >>>/root <<< 30564 1726882886.24414: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30564 1726882886.24461: stderr chunk (state=3): >>><<< 30564 1726882886.24474: stdout chunk (state=3): >>><<< 30564 1726882886.24491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882886.24505: _low_level_execute_command(): starting 30564 1726882886.24523: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272 `" && echo ansible-tmp-1726882886.2449079-34307-239072410607272="` echo /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272 `" ) && sleep 0' 30564 1726882886.25177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882886.25191: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 30564 1726882886.25197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882886.25236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.25240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882886.25255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882886.25262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.25355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882886.25378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882886.25516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882886.27372: stdout chunk (state=3): >>>ansible-tmp-1726882886.2449079-34307-239072410607272=/root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272 <<< 30564 1726882886.27500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882886.27563: stderr chunk (state=3): >>><<< 30564 1726882886.27568: stdout chunk (state=3): >>><<< 30564 1726882886.27690: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882886.2449079-34307-239072410607272=/root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882886.27694: variable 'ansible_module_compression' from source: unknown 30564 1726882886.27696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882886.27720: variable 'ansible_facts' from source: unknown 30564 1726882886.27802: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272/AnsiballZ_service_facts.py 30564 1726882886.27948: Sending initial data 30564 1726882886.27952: Sent initial data (162 bytes) 30564 1726882886.28857: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882886.28895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882886.28915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.28928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882886.28937: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882886.28940: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882886.28951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882886.28957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882886.28962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882886.28969: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882886.28980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.29032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882886.29036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882886.29058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882886.29174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882886.30894: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882886.30991: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882886.31093: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp_5gtartd /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272/AnsiballZ_service_facts.py <<< 30564 1726882886.31190: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882886.32307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882886.32405: stderr chunk (state=3): >>><<< 30564 1726882886.32408: stdout chunk (state=3): >>><<< 30564 1726882886.32423: done transferring module to remote 30564 1726882886.32432: _low_level_execute_command(): starting 30564 1726882886.32437: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272/ /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272/AnsiballZ_service_facts.py && sleep 0' 30564 1726882886.32858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882886.32865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882886.32911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.32918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882886.32921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.32972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882886.32988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882886.33088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882886.34832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882886.34916: stderr chunk (state=3): >>><<< 30564 1726882886.34923: stdout chunk (state=3): >>><<< 30564 1726882886.34949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882886.34956: _low_level_execute_command(): starting 30564 1726882886.34972: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272/AnsiballZ_service_facts.py && sleep 0' 30564 1726882886.35716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882886.35754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882886.35760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882886.35803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882886.35854: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882886.35862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882886.35994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882887.68137: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": 
"nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 30564 1726882887.68223: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 30564 1726882887.68231: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": 
"qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": 
{"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": 
"systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882887.69483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882887.69547: stderr chunk (state=3): >>><<< 30564 1726882887.69551: stdout chunk (state=3): >>><<< 30564 1726882887.69777: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": 
{"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882887.70416: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882887.70657: _low_level_execute_command(): starting 30564 1726882887.70670: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882886.2449079-34307-239072410607272/ > /dev/null 2>&1 && sleep 0' 30564 1726882887.71447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882887.71460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882887.71478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882887.71497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882887.71546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882887.71558: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882887.71575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.71592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882887.71603: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882887.71614: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882887.71625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882887.71648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882887.71665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882887.71678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882887.71689: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882887.71702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.71787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882887.71809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882887.71824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882887.71952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882887.73835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882887.73838: stdout chunk (state=3): >>><<< 30564 1726882887.73840: stderr chunk (state=3): >>><<< 30564 1726882887.74269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882887.74273: handler run complete 30564 1726882887.74275: variable 'ansible_facts' from source: unknown 30564 1726882887.74278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882887.74926: variable 'ansible_facts' from source: unknown 30564 1726882887.75060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882887.75251: attempt loop complete, returning result 30564 1726882887.75263: _execute() done 30564 1726882887.75274: dumping result to json 30564 1726882887.75334: done dumping result, returning 30564 1726882887.75349: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000001b97] 30564 1726882887.75360: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b97 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882887.76186: no more pending results, returning what we have 30564 1726882887.76190: results queue empty 30564 1726882887.76191: checking for any_errors_fatal 30564 1726882887.76198: done checking for any_errors_fatal 30564 1726882887.76199: checking for max_fail_percentage 30564 
1726882887.76201: done checking for max_fail_percentage 30564 1726882887.76202: checking to see if all hosts have failed and the running result is not ok 30564 1726882887.76203: done checking to see if all hosts have failed 30564 1726882887.76204: getting the remaining hosts for this loop 30564 1726882887.76206: done getting the remaining hosts for this loop 30564 1726882887.76210: getting the next task for host managed_node2 30564 1726882887.76219: done getting next task for host managed_node2 30564 1726882887.76223: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882887.76229: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882887.76245: getting variables 30564 1726882887.76250: in VariableManager get_vars() 30564 1726882887.76296: Calling all_inventory to load vars for managed_node2 30564 1726882887.76299: Calling groups_inventory to load vars for managed_node2 30564 1726882887.76301: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882887.76314: Calling all_plugins_play to load vars for managed_node2 30564 1726882887.76317: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882887.76321: Calling groups_plugins_play to load vars for managed_node2 30564 1726882887.76920: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b97 30564 1726882887.76923: WORKER PROCESS EXITING 30564 1726882887.78402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882887.80735: done with get_vars() 30564 1726882887.80757: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:27 -0400 (0:00:01.621) 0:01:26.391 ****** 30564 1726882887.80976: entering _queue_task() for managed_node2/package_facts 30564 1726882887.81670: worker is 1 (out of 1 available) 30564 1726882887.81682: exiting _queue_task() for managed_node2/package_facts 30564 1726882887.81695: done queuing things up, now waiting for results queue to drain 30564 1726882887.81696: waiting for pending results... 
30564 1726882887.82053: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882887.82222: in run() - task 0e448fcc-3ce9-4216-acec-000000001b98 30564 1726882887.82238: variable 'ansible_search_path' from source: unknown 30564 1726882887.82243: variable 'ansible_search_path' from source: unknown 30564 1726882887.82289: calling self._execute() 30564 1726882887.82472: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882887.82492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882887.82496: variable 'omit' from source: magic vars 30564 1726882887.82857: variable 'ansible_distribution_major_version' from source: facts 30564 1726882887.82895: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882887.82900: variable 'omit' from source: magic vars 30564 1726882887.83011: variable 'omit' from source: magic vars 30564 1726882887.83050: variable 'omit' from source: magic vars 30564 1726882887.83095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882887.83132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882887.83157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882887.83181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882887.83193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882887.83225: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882887.83229: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882887.83231: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882887.83345: Set connection var ansible_timeout to 10 30564 1726882887.83358: Set connection var ansible_pipelining to False 30564 1726882887.83362: Set connection var ansible_shell_type to sh 30564 1726882887.83372: Set connection var ansible_shell_executable to /bin/sh 30564 1726882887.83381: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882887.83384: Set connection var ansible_connection to ssh 30564 1726882887.83409: variable 'ansible_shell_executable' from source: unknown 30564 1726882887.83412: variable 'ansible_connection' from source: unknown 30564 1726882887.83415: variable 'ansible_module_compression' from source: unknown 30564 1726882887.83417: variable 'ansible_shell_type' from source: unknown 30564 1726882887.83419: variable 'ansible_shell_executable' from source: unknown 30564 1726882887.83422: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882887.83426: variable 'ansible_pipelining' from source: unknown 30564 1726882887.83429: variable 'ansible_timeout' from source: unknown 30564 1726882887.83432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882887.83637: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882887.83647: variable 'omit' from source: magic vars 30564 1726882887.83650: starting attempt loop 30564 1726882887.83653: running the handler 30564 1726882887.83674: _low_level_execute_command(): starting 30564 1726882887.83741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882887.85758: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882887.85787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882887.85800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882887.85825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882887.85895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882887.85924: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882887.85937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.85978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882887.85992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882887.86002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882887.86029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882887.86046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882887.86077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882887.86111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882887.86132: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882887.86152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.86285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882887.86316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882887.86330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882887.86497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882887.88133: stdout chunk (state=3): >>>/root <<< 30564 1726882887.88311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882887.88320: stdout chunk (state=3): >>><<< 30564 1726882887.88324: stderr chunk (state=3): >>><<< 30564 1726882887.88354: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882887.88368: _low_level_execute_command(): starting 30564 1726882887.88377: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898 `" && echo ansible-tmp-1726882887.883519-34394-152038954139898="` echo /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898 `" ) && sleep 0' 30564 1726882887.89836: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882887.89841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882887.89892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.89898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882887.89913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882887.89920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.89999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882887.90002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882887.90015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882887.90140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882887.92026: stdout chunk (state=3): >>>ansible-tmp-1726882887.883519-34394-152038954139898=/root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898 <<< 30564 1726882887.92144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882887.92215: stderr chunk (state=3): >>><<< 30564 1726882887.92230: stdout chunk (state=3): >>><<< 30564 1726882887.92265: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882887.883519-34394-152038954139898=/root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882887.92346: variable 'ansible_module_compression' from source: unknown 30564 1726882887.92408: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882887.92468: variable 'ansible_facts' from source: unknown 30564 1726882887.92725: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898/AnsiballZ_package_facts.py 30564 1726882887.92942: Sending initial data 30564 1726882887.92946: Sent initial data (161 bytes) 30564 1726882887.94089: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 
1726882887.94098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882887.94127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882887.94142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882887.94208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882887.94231: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882887.94243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.94261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882887.94270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882887.94281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882887.94289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882887.94298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882887.94309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882887.94320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882887.94328: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882887.94338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882887.94422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882887.94441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882887.94452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882887.94575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882887.96349: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882887.96449: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882887.96552: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpz5bavykk /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898/AnsiballZ_package_facts.py <<< 30564 1726882887.96654: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882888.00455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882888.00573: stderr chunk (state=3): >>><<< 30564 1726882888.00577: stdout chunk (state=3): >>><<< 30564 1726882888.00611: done transferring module to remote 30564 1726882888.00621: _low_level_execute_command(): starting 30564 1726882888.00627: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898/ /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898/AnsiballZ_package_facts.py && sleep 0' 30564 1726882888.01622: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882888.01637: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882888.01663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882888.01679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882888.01743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882888.01750: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882888.01760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882888.01778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882888.01783: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882888.01792: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882888.01802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882888.01816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882888.01828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882888.01835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882888.01869: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882888.01872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882888.01948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882888.01966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882888.01981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882888.02108: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882888.04142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882888.04224: stderr chunk (state=3): >>><<< 30564 1726882888.04239: stdout chunk (state=3): >>><<< 30564 1726882888.04280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882888.04284: _low_level_execute_command(): starting 30564 1726882888.04289: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898/AnsiballZ_package_facts.py && sleep 0' 30564 1726882888.05317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882888.05343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882888.05359: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882888.05377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882888.05414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882888.05422: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882888.05441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882888.05487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882888.05490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882888.05508: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882888.05511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882888.05521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882888.05544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882888.05563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882888.05597: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882888.05610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882888.05700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882888.05718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882888.05730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882888.05873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882888.51937: stdout chunk (state=3): >>> 
{"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_<<< 30564 1726882888.52027: stdout chunk (state=3): >>>64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", 
"release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x8<<< 30564 1726882888.52054: stdout chunk (state=3): >>>6_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": 
"8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [<<< 30564 1726882888.52060: stdout chunk (state=3): >>>{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": 
"2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba<<< 30564 1726882888.52065: stdout chunk (state=3): >>>", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", 
"release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "py<<< 30564 1726882888.52129: stdout chunk (state=3): >>>thon3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epo<<< 30564 1726882888.52142: stdout chunk (state=3): >>>ch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": 
[{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", 
"release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch"<<< 30564 1726882888.52148: stdout chunk (state=3): >>>: 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.<<< 30564 1726882888.52162: stdout chunk (state=3): >>>9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source":<<< 30564 1726882888.52171: stdout chunk (state=3): >>> "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": 
"1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rp<<< 30564 1726882888.52184: stdout chunk (state=3): >>>m"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1"<<< 30564 1726882888.52190: stdout chunk (state=3): >>>, "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysin<<< 30564 1726882888.52193: stdout chunk (state=3): >>>it", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", 
"version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, 
"arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", 
"version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", 
"version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882888.53736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882888.53739: stdout chunk (state=3): >>><<< 30564 1726882888.53748: stderr chunk (state=3): >>><<< 30564 1726882888.53795: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": 
[{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", 
"version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": 
"p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": 
"util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": 
"python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": 
"util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": 
"libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": 
"2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", 
"version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": 
[{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": 
"2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": 
"2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": 
"1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": 
"perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", 
"version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": 
[{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": 
"6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882888.58758: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882888.58780: _low_level_execute_command(): starting 30564 1726882888.58784: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882887.883519-34394-152038954139898/ > /dev/null 2>&1 && sleep 0' 30564 1726882888.60347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882888.60685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882888.60695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882888.60710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882888.60750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882888.60757: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882888.60770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882888.60787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882888.60797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882888.60800: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882888.60808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882888.60818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882888.60829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882888.60836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882888.60843: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882888.60851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882888.60923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882888.61087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882888.61093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882888.61408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882888.63302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882888.63313: stdout chunk (state=3): >>><<< 30564 1726882888.63316: stderr chunk (state=3): >>><<< 30564 1726882888.63323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882888.63330: handler run complete 30564 1726882888.64461: variable 'ansible_facts' from source: unknown 30564 1726882888.65402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882888.69903: variable 'ansible_facts' from source: unknown 30564 1726882888.71078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882888.72501: attempt loop complete, returning result 30564 1726882888.72519: _execute() done 30564 1726882888.72526: dumping result to json 30564 1726882888.72792: done dumping result, returning 30564 1726882888.72809: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000001b98] 30564 1726882888.72821: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b98 30564 1726882888.75891: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b98 30564 1726882888.75895: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882888.76114: no more pending results, returning what we have 30564 1726882888.76117: results queue empty 30564 1726882888.76118: checking for 
any_errors_fatal 30564 1726882888.76127: done checking for any_errors_fatal 30564 1726882888.76128: checking for max_fail_percentage 30564 1726882888.76130: done checking for max_fail_percentage 30564 1726882888.76131: checking to see if all hosts have failed and the running result is not ok 30564 1726882888.76132: done checking to see if all hosts have failed 30564 1726882888.76133: getting the remaining hosts for this loop 30564 1726882888.76135: done getting the remaining hosts for this loop 30564 1726882888.76139: getting the next task for host managed_node2 30564 1726882888.76149: done getting next task for host managed_node2 30564 1726882888.76152: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882888.76159: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882888.76181: getting variables 30564 1726882888.76184: in VariableManager get_vars() 30564 1726882888.76219: Calling all_inventory to load vars for managed_node2 30564 1726882888.76222: Calling groups_inventory to load vars for managed_node2 30564 1726882888.76229: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882888.76240: Calling all_plugins_play to load vars for managed_node2 30564 1726882888.76243: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882888.76246: Calling groups_plugins_play to load vars for managed_node2 30564 1726882888.79989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882888.84138: done with get_vars() 30564 1726882888.84217: done getting variables 30564 1726882888.84318: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:28 -0400 (0:00:01.033) 0:01:27.424 ****** 30564 1726882888.84659: entering _queue_task() for managed_node2/debug 30564 1726882888.85240: worker is 1 (out of 1 available) 30564 1726882888.85254: exiting _queue_task() for managed_node2/debug 30564 1726882888.85274: done queuing things up, now waiting for results queue to drain 30564 1726882888.85276: waiting for pending results... 
30564 1726882888.85610: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882888.85778: in run() - task 0e448fcc-3ce9-4216-acec-000000001b3c 30564 1726882888.85810: variable 'ansible_search_path' from source: unknown 30564 1726882888.85820: variable 'ansible_search_path' from source: unknown 30564 1726882888.85867: calling self._execute() 30564 1726882888.85990: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882888.86007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882888.86030: variable 'omit' from source: magic vars 30564 1726882888.86459: variable 'ansible_distribution_major_version' from source: facts 30564 1726882888.86483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882888.86493: variable 'omit' from source: magic vars 30564 1726882888.86568: variable 'omit' from source: magic vars 30564 1726882888.86691: variable 'network_provider' from source: set_fact 30564 1726882888.86723: variable 'omit' from source: magic vars 30564 1726882888.86781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882888.86833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882888.86861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882888.86905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882888.86925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882888.86960: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882888.86973: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882888.87004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882888.87133: Set connection var ansible_timeout to 10 30564 1726882888.87143: Set connection var ansible_pipelining to False 30564 1726882888.87149: Set connection var ansible_shell_type to sh 30564 1726882888.87158: Set connection var ansible_shell_executable to /bin/sh 30564 1726882888.87171: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882888.87177: Set connection var ansible_connection to ssh 30564 1726882888.87212: variable 'ansible_shell_executable' from source: unknown 30564 1726882888.87229: variable 'ansible_connection' from source: unknown 30564 1726882888.87239: variable 'ansible_module_compression' from source: unknown 30564 1726882888.87245: variable 'ansible_shell_type' from source: unknown 30564 1726882888.87251: variable 'ansible_shell_executable' from source: unknown 30564 1726882888.87256: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882888.87262: variable 'ansible_pipelining' from source: unknown 30564 1726882888.87271: variable 'ansible_timeout' from source: unknown 30564 1726882888.87278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882888.87446: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882888.87466: variable 'omit' from source: magic vars 30564 1726882888.87476: starting attempt loop 30564 1726882888.87486: running the handler 30564 1726882888.87537: handler run complete 30564 1726882888.87569: attempt loop complete, returning result 30564 1726882888.87577: _execute() done 30564 1726882888.87584: dumping result to json 30564 1726882888.87590: done dumping result, returning 
30564 1726882888.87603: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000001b3c] 30564 1726882888.87613: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3c ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882888.87795: no more pending results, returning what we have 30564 1726882888.87798: results queue empty 30564 1726882888.87799: checking for any_errors_fatal 30564 1726882888.87811: done checking for any_errors_fatal 30564 1726882888.87811: checking for max_fail_percentage 30564 1726882888.87813: done checking for max_fail_percentage 30564 1726882888.87814: checking to see if all hosts have failed and the running result is not ok 30564 1726882888.87815: done checking to see if all hosts have failed 30564 1726882888.87816: getting the remaining hosts for this loop 30564 1726882888.87818: done getting the remaining hosts for this loop 30564 1726882888.87822: getting the next task for host managed_node2 30564 1726882888.87832: done getting next task for host managed_node2 30564 1726882888.87837: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882888.87845: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882888.87859: getting variables 30564 1726882888.87860: in VariableManager get_vars() 30564 1726882888.87902: Calling all_inventory to load vars for managed_node2 30564 1726882888.87904: Calling groups_inventory to load vars for managed_node2 30564 1726882888.87907: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882888.87917: Calling all_plugins_play to load vars for managed_node2 30564 1726882888.87920: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882888.87923: Calling groups_plugins_play to load vars for managed_node2 30564 1726882888.88926: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3c 30564 1726882888.88929: WORKER PROCESS EXITING 30564 1726882888.91052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882888.93237: done with get_vars() 30564 1726882888.93309: done getting variables 30564 1726882888.93493: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:28 -0400 (0:00:00.091) 0:01:27.516 ****** 30564 1726882888.93534: entering _queue_task() for managed_node2/fail 30564 1726882888.94102: worker is 1 (out of 1 available) 30564 1726882888.94115: exiting _queue_task() for managed_node2/fail 30564 1726882888.94247: done queuing things up, now waiting for results queue to drain 30564 1726882888.94249: waiting for pending results... 30564 1726882888.95101: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882888.95254: in run() - task 0e448fcc-3ce9-4216-acec-000000001b3d 30564 1726882888.95278: variable 'ansible_search_path' from source: unknown 30564 1726882888.95286: variable 'ansible_search_path' from source: unknown 30564 1726882888.95323: calling self._execute() 30564 1726882888.95431: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882888.95445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882888.95463: variable 'omit' from source: magic vars 30564 1726882888.95869: variable 'ansible_distribution_major_version' from source: facts 30564 1726882888.95894: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882888.96028: variable 'network_state' from source: role '' defaults 30564 1726882888.96046: Evaluated conditional (network_state != {}): False 30564 1726882888.96054: when evaluation is False, skipping this task 30564 1726882888.96061: _execute() done 30564 1726882888.96070: dumping result to json 30564 1726882888.96077: done dumping result, returning 30564 1726882888.96087: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-000000001b3d] 30564 1726882888.96100: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882888.96255: no more pending results, returning what we have 30564 1726882888.96259: results queue empty 30564 1726882888.96261: checking for any_errors_fatal 30564 1726882888.96275: done checking for any_errors_fatal 30564 1726882888.96277: checking for max_fail_percentage 30564 1726882888.96280: done checking for max_fail_percentage 30564 1726882888.96281: checking to see if all hosts have failed and the running result is not ok 30564 1726882888.96282: done checking to see if all hosts have failed 30564 1726882888.96283: getting the remaining hosts for this loop 30564 1726882888.96285: done getting the remaining hosts for this loop 30564 1726882888.96289: getting the next task for host managed_node2 30564 1726882888.96298: done getting next task for host managed_node2 30564 1726882888.96302: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882888.96310: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882888.96341: getting variables 30564 1726882888.96343: in VariableManager get_vars() 30564 1726882888.96389: Calling all_inventory to load vars for managed_node2 30564 1726882888.96392: Calling groups_inventory to load vars for managed_node2 30564 1726882888.96394: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882888.96407: Calling all_plugins_play to load vars for managed_node2 30564 1726882888.96410: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882888.96413: Calling groups_plugins_play to load vars for managed_node2 30564 1726882888.97393: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3d 30564 1726882888.97397: WORKER PROCESS EXITING 30564 1726882888.98400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882889.00188: done with get_vars() 30564 1726882889.00210: done getting variables 30564 1726882889.00276: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:29 -0400 (0:00:00.067) 0:01:27.584 ****** 30564 1726882889.00311: entering _queue_task() for managed_node2/fail 30564 1726882889.00608: worker is 1 (out of 1 available) 30564 1726882889.00622: exiting _queue_task() for managed_node2/fail 30564 1726882889.00633: done queuing things up, now waiting for results queue to drain 30564 1726882889.00634: waiting for pending results... 30564 1726882889.00950: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882889.01114: in run() - task 0e448fcc-3ce9-4216-acec-000000001b3e 30564 1726882889.01135: variable 'ansible_search_path' from source: unknown 30564 1726882889.01142: variable 'ansible_search_path' from source: unknown 30564 1726882889.01188: calling self._execute() 30564 1726882889.01309: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882889.01320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882889.01334: variable 'omit' from source: magic vars 30564 1726882889.01766: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.01789: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882889.01924: variable 'network_state' from source: role '' defaults 30564 1726882889.01942: Evaluated conditional (network_state != {}): False 30564 1726882889.01956: when evaluation is False, skipping this task 30564 1726882889.01968: _execute() done 30564 1726882889.01977: dumping result to json 30564 1726882889.01986: done dumping result, returning 30564 1726882889.02003: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-000000001b3e] 30564 1726882889.02015: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3e 30564 1726882889.02139: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3e 30564 1726882889.02147: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882889.02199: no more pending results, returning what we have 30564 1726882889.02204: results queue empty 30564 1726882889.02205: checking for any_errors_fatal 30564 1726882889.02215: done checking for any_errors_fatal 30564 1726882889.02216: checking for max_fail_percentage 30564 1726882889.02218: done checking for max_fail_percentage 30564 1726882889.02219: checking to see if all hosts have failed and the running result is not ok 30564 1726882889.02220: done checking to see if all hosts have failed 30564 1726882889.02221: getting the remaining hosts for this loop 30564 1726882889.02223: done getting the remaining hosts for this loop 30564 1726882889.02226: getting the next task for host managed_node2 30564 1726882889.02237: done getting next task for host managed_node2 30564 1726882889.02241: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882889.02248: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882889.02281: getting variables 30564 1726882889.02283: in VariableManager get_vars() 30564 1726882889.02324: Calling all_inventory to load vars for managed_node2 30564 1726882889.02327: Calling groups_inventory to load vars for managed_node2 30564 1726882889.02330: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882889.02342: Calling all_plugins_play to load vars for managed_node2 30564 1726882889.02345: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882889.02348: Calling groups_plugins_play to load vars for managed_node2 30564 1726882889.04057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882889.06022: done with get_vars() 30564 1726882889.06044: done getting variables 30564 1726882889.06113: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 
September 2024 21:41:29 -0400 (0:00:00.058) 0:01:27.642 ****** 30564 1726882889.06145: entering _queue_task() for managed_node2/fail 30564 1726882889.06790: worker is 1 (out of 1 available) 30564 1726882889.06802: exiting _queue_task() for managed_node2/fail 30564 1726882889.06813: done queuing things up, now waiting for results queue to drain 30564 1726882889.06814: waiting for pending results... 30564 1726882889.07725: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882889.07902: in run() - task 0e448fcc-3ce9-4216-acec-000000001b3f 30564 1726882889.07928: variable 'ansible_search_path' from source: unknown 30564 1726882889.07936: variable 'ansible_search_path' from source: unknown 30564 1726882889.07990: calling self._execute() 30564 1726882889.08101: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882889.08111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882889.08129: variable 'omit' from source: magic vars 30564 1726882889.08527: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.08544: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882889.08735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882889.13109: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882889.13246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882889.13347: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882889.13418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 
1726882889.13488: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882889.13618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.13655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.13688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.13767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.13787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.13912: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.13938: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882889.13947: when evaluation is False, skipping this task 30564 1726882889.13958: _execute() done 30564 1726882889.13965: dumping result to json 30564 1726882889.13973: done dumping result, returning 30564 1726882889.13984: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-000000001b3f] 30564 1726882889.13994: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3f 
skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882889.14146: no more pending results, returning what we have 30564 1726882889.14150: results queue empty 30564 1726882889.14151: checking for any_errors_fatal 30564 1726882889.14159: done checking for any_errors_fatal 30564 1726882889.14160: checking for max_fail_percentage 30564 1726882889.14162: done checking for max_fail_percentage 30564 1726882889.14165: checking to see if all hosts have failed and the running result is not ok 30564 1726882889.14166: done checking to see if all hosts have failed 30564 1726882889.14167: getting the remaining hosts for this loop 30564 1726882889.14169: done getting the remaining hosts for this loop 30564 1726882889.14173: getting the next task for host managed_node2 30564 1726882889.14183: done getting next task for host managed_node2 30564 1726882889.14188: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882889.14194: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882889.14225: getting variables 30564 1726882889.14227: in VariableManager get_vars() 30564 1726882889.14269: Calling all_inventory to load vars for managed_node2 30564 1726882889.14272: Calling groups_inventory to load vars for managed_node2 30564 1726882889.14274: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882889.14285: Calling all_plugins_play to load vars for managed_node2 30564 1726882889.14288: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882889.14291: Calling groups_plugins_play to load vars for managed_node2 30564 1726882889.15312: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b3f 30564 1726882889.15316: WORKER PROCESS EXITING 30564 1726882889.17880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882889.20021: done with get_vars() 30564 1726882889.20047: done getting variables 30564 1726882889.20115: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:29 -0400 (0:00:00.139) 0:01:27.782 ****** 30564 1726882889.20148: entering 
_queue_task() for managed_node2/dnf 30564 1726882889.21548: worker is 1 (out of 1 available) 30564 1726882889.21565: exiting _queue_task() for managed_node2/dnf 30564 1726882889.21580: done queuing things up, now waiting for results queue to drain 30564 1726882889.21581: waiting for pending results... 30564 1726882889.22489: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882889.23026: in run() - task 0e448fcc-3ce9-4216-acec-000000001b40 30564 1726882889.23049: variable 'ansible_search_path' from source: unknown 30564 1726882889.23081: variable 'ansible_search_path' from source: unknown 30564 1726882889.23272: calling self._execute() 30564 1726882889.23659: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882889.23734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882889.23767: variable 'omit' from source: magic vars 30564 1726882889.24517: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.24537: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882889.24865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882889.28251: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882889.28451: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882889.28503: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882889.28542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882889.28598: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882889.28849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.28885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.28923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.28969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.29013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.29320: variable 'ansible_distribution' from source: facts 30564 1726882889.29328: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.29345: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882889.29507: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882889.29648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.29680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.29715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.29758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.29779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.29827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.29853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.29885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.29954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.29975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.30024: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.30050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.30151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.30195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.30248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.30733: variable 'network_connections' from source: include params 30564 1726882889.30959: variable 'interface' from source: play vars 30564 1726882889.31034: variable 'interface' from source: play vars 30564 1726882889.31111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882889.32636: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882889.32855: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882889.32948: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882889.33037: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882889.33358: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882889.33424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882889.33641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.33699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882889.33928: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882889.34367: variable 'network_connections' from source: include params 30564 1726882889.34379: variable 'interface' from source: play vars 30564 1726882889.34446: variable 'interface' from source: play vars 30564 1726882889.34478: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882889.34486: when evaluation is False, skipping this task 30564 1726882889.34492: _execute() done 30564 1726882889.34498: dumping result to json 30564 1726882889.34546: done dumping result, returning 30564 1726882889.34583: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001b40] 30564 1726882889.34626: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b40 30564 1726882889.34756: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b40 skipping: [managed_node2] => { "changed": false, 
"false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882889.34942: no more pending results, returning what we have 30564 1726882889.34947: results queue empty 30564 1726882889.34948: checking for any_errors_fatal 30564 1726882889.34955: done checking for any_errors_fatal 30564 1726882889.34956: checking for max_fail_percentage 30564 1726882889.34958: done checking for max_fail_percentage 30564 1726882889.34959: checking to see if all hosts have failed and the running result is not ok 30564 1726882889.34960: done checking to see if all hosts have failed 30564 1726882889.34961: getting the remaining hosts for this loop 30564 1726882889.35028: done getting the remaining hosts for this loop 30564 1726882889.35100: getting the next task for host managed_node2 30564 1726882889.35183: done getting next task for host managed_node2 30564 1726882889.35188: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882889.35194: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882889.35221: getting variables 30564 1726882889.35223: in VariableManager get_vars() 30564 1726882889.35267: Calling all_inventory to load vars for managed_node2 30564 1726882889.35270: Calling groups_inventory to load vars for managed_node2 30564 1726882889.35273: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882889.35286: Calling all_plugins_play to load vars for managed_node2 30564 1726882889.35290: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882889.35293: Calling groups_plugins_play to load vars for managed_node2 30564 1726882889.36876: WORKER PROCESS EXITING 30564 1726882889.38771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882889.41031: done with get_vars() 30564 1726882889.41055: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882889.41138: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:29 -0400 (0:00:00.210) 0:01:27.993 ****** 30564 1726882889.41173: entering 
_queue_task() for managed_node2/yum 30564 1726882889.41483: worker is 1 (out of 1 available) 30564 1726882889.41495: exiting _queue_task() for managed_node2/yum 30564 1726882889.41507: done queuing things up, now waiting for results queue to drain 30564 1726882889.41509: waiting for pending results... 30564 1726882889.41826: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882889.42090: in run() - task 0e448fcc-3ce9-4216-acec-000000001b41 30564 1726882889.42108: variable 'ansible_search_path' from source: unknown 30564 1726882889.42177: variable 'ansible_search_path' from source: unknown 30564 1726882889.42223: calling self._execute() 30564 1726882889.42443: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882889.42508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882889.42530: variable 'omit' from source: magic vars 30564 1726882889.43408: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.43487: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882889.43936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882889.48113: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882889.48259: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882889.48303: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882889.48357: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882889.48389: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882889.48479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.48512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.48550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.48599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.48618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.48719: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.48885: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882889.48893: when evaluation is False, skipping this task 30564 1726882889.48901: _execute() done 30564 1726882889.48908: dumping result to json 30564 1726882889.48915: done dumping result, returning 30564 1726882889.48925: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001b41] 30564 1726882889.48935: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b41 skipping: [managed_node2] => { 
"changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882889.49104: no more pending results, returning what we have 30564 1726882889.49108: results queue empty 30564 1726882889.49109: checking for any_errors_fatal 30564 1726882889.49117: done checking for any_errors_fatal 30564 1726882889.49117: checking for max_fail_percentage 30564 1726882889.49119: done checking for max_fail_percentage 30564 1726882889.49120: checking to see if all hosts have failed and the running result is not ok 30564 1726882889.49121: done checking to see if all hosts have failed 30564 1726882889.49122: getting the remaining hosts for this loop 30564 1726882889.49125: done getting the remaining hosts for this loop 30564 1726882889.49129: getting the next task for host managed_node2 30564 1726882889.49139: done getting next task for host managed_node2 30564 1726882889.49143: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882889.49149: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882889.49181: getting variables 30564 1726882889.49183: in VariableManager get_vars() 30564 1726882889.49225: Calling all_inventory to load vars for managed_node2 30564 1726882889.49228: Calling groups_inventory to load vars for managed_node2 30564 1726882889.49230: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882889.49241: Calling all_plugins_play to load vars for managed_node2 30564 1726882889.49244: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882889.49247: Calling groups_plugins_play to load vars for managed_node2 30564 1726882889.50349: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b41 30564 1726882889.50352: WORKER PROCESS EXITING 30564 1726882889.52413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882889.57397: done with get_vars() 30564 1726882889.57433: done getting variables 30564 1726882889.57500: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:29 -0400 (0:00:00.163) 0:01:28.156 ****** 30564 1726882889.57538: entering _queue_task() for managed_node2/fail 30564 1726882889.57852: worker is 1 (out of 1 available) 30564 1726882889.58659: 
exiting _queue_task() for managed_node2/fail 30564 1726882889.58672: done queuing things up, now waiting for results queue to drain 30564 1726882889.58674: waiting for pending results... 30564 1726882889.58695: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882889.59292: in run() - task 0e448fcc-3ce9-4216-acec-000000001b42 30564 1726882889.59314: variable 'ansible_search_path' from source: unknown 30564 1726882889.59324: variable 'ansible_search_path' from source: unknown 30564 1726882889.59487: calling self._execute() 30564 1726882889.59608: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882889.59683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882889.59701: variable 'omit' from source: magic vars 30564 1726882889.60508: variable 'ansible_distribution_major_version' from source: facts 30564 1726882889.60559: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882889.60877: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882889.61188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882889.76395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882889.76465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882889.76506: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882889.76537: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882889.76563: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 
1726882889.76807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.76943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.76946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.77185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.77188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.77191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.77193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.77200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.77282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30564 1726882889.77298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.77336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882889.77479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882889.77501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.77539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882889.77552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882889.78059: variable 'network_connections' from source: include params 30564 1726882889.78074: variable 'interface' from source: play vars 30564 1726882889.78156: variable 'interface' from source: play vars 30564 1726882889.78236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882889.78418: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882889.78459: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882889.78493: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882889.78520: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882889.78568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882889.78595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882889.78619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882889.78644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882889.78693: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882889.78938: variable 'network_connections' from source: include params 30564 1726882889.78941: variable 'interface' from source: play vars 30564 1726882889.79013: variable 'interface' from source: play vars 30564 1726882889.79034: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882889.79038: when evaluation is False, skipping this task 30564 1726882889.79040: _execute() done 30564 1726882889.79043: dumping result to json 30564 1726882889.79045: done dumping result, returning 30564 1726882889.79052: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001b42] 30564 
1726882889.79055: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b42 30564 1726882889.79249: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b42 30564 1726882889.79253: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882889.79305: no more pending results, returning what we have 30564 1726882889.79308: results queue empty 30564 1726882889.79309: checking for any_errors_fatal 30564 1726882889.79315: done checking for any_errors_fatal 30564 1726882889.79316: checking for max_fail_percentage 30564 1726882889.79317: done checking for max_fail_percentage 30564 1726882889.79318: checking to see if all hosts have failed and the running result is not ok 30564 1726882889.79319: done checking to see if all hosts have failed 30564 1726882889.79320: getting the remaining hosts for this loop 30564 1726882889.79321: done getting the remaining hosts for this loop 30564 1726882889.79325: getting the next task for host managed_node2 30564 1726882889.79332: done getting next task for host managed_node2 30564 1726882889.79336: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882889.79341: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882889.79365: getting variables 30564 1726882889.79367: in VariableManager get_vars() 30564 1726882889.79404: Calling all_inventory to load vars for managed_node2 30564 1726882889.79407: Calling groups_inventory to load vars for managed_node2 30564 1726882889.79409: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882889.79418: Calling all_plugins_play to load vars for managed_node2 30564 1726882889.79421: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882889.79424: Calling groups_plugins_play to load vars for managed_node2 30564 1726882889.96843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882890.00473: done with get_vars() 30564 1726882890.00505: done getting variables 30564 1726882890.00550: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:30 -0400 (0:00:00.430) 0:01:28.587 ****** 
30564 1726882890.00583: entering _queue_task() for managed_node2/package 30564 1726882890.00940: worker is 1 (out of 1 available) 30564 1726882890.00953: exiting _queue_task() for managed_node2/package 30564 1726882890.00969: done queuing things up, now waiting for results queue to drain 30564 1726882890.00970: waiting for pending results... 30564 1726882890.01300: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882890.01476: in run() - task 0e448fcc-3ce9-4216-acec-000000001b43 30564 1726882890.01500: variable 'ansible_search_path' from source: unknown 30564 1726882890.01509: variable 'ansible_search_path' from source: unknown 30564 1726882890.01553: calling self._execute() 30564 1726882890.01676: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882890.01691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882890.01712: variable 'omit' from source: magic vars 30564 1726882890.02146: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.02168: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882890.02389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882890.02697: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882890.02753: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882890.02836: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882890.02878: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882890.03020: variable 'network_packages' from source: role '' defaults 30564 1726882890.03197: variable '__network_provider_setup' from source: role '' defaults 30564 
1726882890.03272: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882890.03397: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882890.03411: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882890.03560: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882890.03953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882890.06035: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882890.06085: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882890.06111: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882890.06134: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882890.06164: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882890.06225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.06245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.06268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.06298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.06309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.06338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.06355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.06379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.06406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.06416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.06559: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882890.06634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.06651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 30564 1726882890.06668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.06697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.06708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.07118: variable 'ansible_python' from source: facts 30564 1726882890.07122: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882890.07124: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882890.07126: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882890.07129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.07131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.07133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.07159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 
1726882890.07179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.07265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.07350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.07540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.07543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.07545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.07796: variable 'network_connections' from source: include params 30564 1726882890.07803: variable 'interface' from source: play vars 30564 1726882890.07902: variable 'interface' from source: play vars 30564 1726882890.07962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882890.07993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 
1726882890.08022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.08051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882890.08100: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882890.08948: variable 'network_connections' from source: include params 30564 1726882890.08951: variable 'interface' from source: play vars 30564 1726882890.09056: variable 'interface' from source: play vars 30564 1726882890.09087: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882890.09165: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882890.09619: variable 'network_connections' from source: include params 30564 1726882890.09622: variable 'interface' from source: play vars 30564 1726882890.09697: variable 'interface' from source: play vars 30564 1726882890.09724: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882890.09807: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882890.10052: variable 'network_connections' from source: include params 30564 1726882890.10055: variable 'interface' from source: play vars 30564 1726882890.10103: variable 'interface' from source: play vars 30564 1726882890.10144: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882890.10186: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882890.10191: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882890.10235: variable '__network_packages_default_initscripts' from source: 
role '' defaults 30564 1726882890.10378: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882890.10713: variable 'network_connections' from source: include params 30564 1726882890.10722: variable 'interface' from source: play vars 30564 1726882890.10786: variable 'interface' from source: play vars 30564 1726882890.10798: variable 'ansible_distribution' from source: facts 30564 1726882890.10818: variable '__network_rh_distros' from source: role '' defaults 30564 1726882890.10828: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.10844: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882890.11010: variable 'ansible_distribution' from source: facts 30564 1726882890.11017: variable '__network_rh_distros' from source: role '' defaults 30564 1726882890.11035: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.11049: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882890.11216: variable 'ansible_distribution' from source: facts 30564 1726882890.11224: variable '__network_rh_distros' from source: role '' defaults 30564 1726882890.11234: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.11286: variable 'network_provider' from source: set_fact 30564 1726882890.11306: variable 'ansible_facts' from source: unknown 30564 1726882890.12284: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882890.12293: when evaluation is False, skipping this task 30564 1726882890.12300: _execute() done 30564 1726882890.12307: dumping result to json 30564 1726882890.12314: done dumping result, returning 30564 1726882890.12325: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000001b43] 30564 1726882890.12346: 
sending task result for task 0e448fcc-3ce9-4216-acec-000000001b43 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882890.12530: no more pending results, returning what we have 30564 1726882890.12534: results queue empty 30564 1726882890.12535: checking for any_errors_fatal 30564 1726882890.12548: done checking for any_errors_fatal 30564 1726882890.12549: checking for max_fail_percentage 30564 1726882890.12551: done checking for max_fail_percentage 30564 1726882890.12552: checking to see if all hosts have failed and the running result is not ok 30564 1726882890.12553: done checking to see if all hosts have failed 30564 1726882890.12554: getting the remaining hosts for this loop 30564 1726882890.12557: done getting the remaining hosts for this loop 30564 1726882890.12561: getting the next task for host managed_node2 30564 1726882890.12582: done getting next task for host managed_node2 30564 1726882890.12588: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882890.12594: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882890.12627: getting variables 30564 1726882890.12629: in VariableManager get_vars() 30564 1726882890.12693: Calling all_inventory to load vars for managed_node2 30564 1726882890.12696: Calling groups_inventory to load vars for managed_node2 30564 1726882890.12703: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882890.12715: Calling all_plugins_play to load vars for managed_node2 30564 1726882890.12719: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882890.12722: Calling groups_plugins_play to load vars for managed_node2 30564 1726882890.13821: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b43 30564 1726882890.13826: WORKER PROCESS EXITING 30564 1726882890.15359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882890.18423: done with get_vars() 30564 1726882890.18467: done getting variables 30564 1726882890.18533: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:30 -0400 (0:00:00.180) 0:01:28.767 ****** 30564 1726882890.18588: entering _queue_task() for 
managed_node2/package 30564 1726882890.18975: worker is 1 (out of 1 available) 30564 1726882890.18992: exiting _queue_task() for managed_node2/package 30564 1726882890.19018: done queuing things up, now waiting for results queue to drain 30564 1726882890.19020: waiting for pending results... 30564 1726882890.19403: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882890.19614: in run() - task 0e448fcc-3ce9-4216-acec-000000001b44 30564 1726882890.19634: variable 'ansible_search_path' from source: unknown 30564 1726882890.20304: variable 'ansible_search_path' from source: unknown 30564 1726882890.20345: calling self._execute() 30564 1726882890.20580: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882890.20603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882890.20629: variable 'omit' from source: magic vars 30564 1726882890.21121: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.21149: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882890.21601: variable 'network_state' from source: role '' defaults 30564 1726882890.21618: Evaluated conditional (network_state != {}): False 30564 1726882890.21626: when evaluation is False, skipping this task 30564 1726882890.21633: _execute() done 30564 1726882890.21641: dumping result to json 30564 1726882890.21648: done dumping result, returning 30564 1726882890.21660: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001b44] 30564 1726882890.21678: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b44 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" 
} 30564 1726882890.21861: no more pending results, returning what we have 30564 1726882890.21870: results queue empty 30564 1726882890.21871: checking for any_errors_fatal 30564 1726882890.21882: done checking for any_errors_fatal 30564 1726882890.21883: checking for max_fail_percentage 30564 1726882890.21885: done checking for max_fail_percentage 30564 1726882890.21886: checking to see if all hosts have failed and the running result is not ok 30564 1726882890.21886: done checking to see if all hosts have failed 30564 1726882890.21887: getting the remaining hosts for this loop 30564 1726882890.21889: done getting the remaining hosts for this loop 30564 1726882890.21893: getting the next task for host managed_node2 30564 1726882890.21905: done getting next task for host managed_node2 30564 1726882890.21910: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882890.21920: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882890.21957: getting variables 30564 1726882890.21959: in VariableManager get_vars() 30564 1726882890.22009: Calling all_inventory to load vars for managed_node2 30564 1726882890.22012: Calling groups_inventory to load vars for managed_node2 30564 1726882890.22015: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882890.22028: Calling all_plugins_play to load vars for managed_node2 30564 1726882890.22035: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882890.22040: Calling groups_plugins_play to load vars for managed_node2 30564 1726882890.23017: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b44 30564 1726882890.23020: WORKER PROCESS EXITING 30564 1726882890.25823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882890.28233: done with get_vars() 30564 1726882890.28257: done getting variables 30564 1726882890.28349: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:30 -0400 (0:00:00.097) 0:01:28.865 ****** 30564 1726882890.28389: entering _queue_task() for managed_node2/package 30564 1726882890.28955: worker is 1 (out of 1 available) 30564 1726882890.28971: exiting _queue_task() for managed_node2/package 30564 1726882890.28994: done queuing things up, now waiting for results queue to drain 30564 1726882890.28995: 
waiting for pending results... 30564 1726882890.30029: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882890.30188: in run() - task 0e448fcc-3ce9-4216-acec-000000001b45 30564 1726882890.30200: variable 'ansible_search_path' from source: unknown 30564 1726882890.30204: variable 'ansible_search_path' from source: unknown 30564 1726882890.30261: calling self._execute() 30564 1726882890.30374: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882890.30378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882890.30389: variable 'omit' from source: magic vars 30564 1726882890.30796: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.30813: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882890.30947: variable 'network_state' from source: role '' defaults 30564 1726882890.30956: Evaluated conditional (network_state != {}): False 30564 1726882890.30959: when evaluation is False, skipping this task 30564 1726882890.30963: _execute() done 30564 1726882890.30967: dumping result to json 30564 1726882890.30973: done dumping result, returning 30564 1726882890.30979: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001b45] 30564 1726882890.30985: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b45 30564 1726882890.31099: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b45 30564 1726882890.31102: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882890.31176: no more pending results, returning what we have 30564 1726882890.31181: results queue empty 30564 
1726882890.31182: checking for any_errors_fatal 30564 1726882890.31190: done checking for any_errors_fatal 30564 1726882890.31191: checking for max_fail_percentage 30564 1726882890.31193: done checking for max_fail_percentage 30564 1726882890.31194: checking to see if all hosts have failed and the running result is not ok 30564 1726882890.31195: done checking to see if all hosts have failed 30564 1726882890.31196: getting the remaining hosts for this loop 30564 1726882890.31197: done getting the remaining hosts for this loop 30564 1726882890.31201: getting the next task for host managed_node2 30564 1726882890.31211: done getting next task for host managed_node2 30564 1726882890.31215: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882890.31225: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882890.31257: getting variables 30564 1726882890.31259: in VariableManager get_vars() 30564 1726882890.31310: Calling all_inventory to load vars for managed_node2 30564 1726882890.31313: Calling groups_inventory to load vars for managed_node2 30564 1726882890.31316: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882890.31329: Calling all_plugins_play to load vars for managed_node2 30564 1726882890.31332: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882890.31335: Calling groups_plugins_play to load vars for managed_node2 30564 1726882890.33132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882890.34802: done with get_vars() 30564 1726882890.34822: done getting variables 30564 1726882890.34876: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:30 -0400 (0:00:00.065) 0:01:28.930 ****** 30564 1726882890.34908: entering _queue_task() for managed_node2/service 30564 1726882890.35177: worker is 1 (out of 1 available) 30564 1726882890.35190: exiting _queue_task() for managed_node2/service 30564 1726882890.35203: done queuing things up, now waiting for results queue to drain 30564 1726882890.35204: waiting for pending results... 
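
Both package tasks above (tasks/main.yml:85 and :96) were skipped with false_condition "network_state != {}". A minimal plain-Python sketch of that evaluation (the empty-dict value is taken from the log's "role '' defaults" line; this is an illustration, not Ansible's actual Jinja2 evaluation):

```python
# Plain-Python sketch of the skip condition "network_state != {}".
# Per the log, network_state comes from the role's defaults and is empty,
# so both network_state-gated tasks report "Conditional result was False".
network_state = {}  # role default, as shown in the log

would_run = network_state != {}
print(would_run)  # False -> Ansible skips the task
```

An empty `network_state` means the nmstate-based code path is unused, so neither NetworkManager/nmstate nor python3-libnmstate needs to be installed for it.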
30564 1726882890.35521: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882890.35642: in run() - task 0e448fcc-3ce9-4216-acec-000000001b46 30564 1726882890.35652: variable 'ansible_search_path' from source: unknown 30564 1726882890.35655: variable 'ansible_search_path' from source: unknown 30564 1726882890.35694: calling self._execute() 30564 1726882890.35802: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882890.35806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882890.35824: variable 'omit' from source: magic vars 30564 1726882890.36218: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.36232: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882890.36357: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882890.36552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882890.38936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882890.39020: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882890.39053: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882890.39094: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882890.39119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882890.39202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882890.39230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.39255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.39303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.39316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.39361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.39389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.39415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.39454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.39471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.39513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.39537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.39562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.39605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.39621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.39811: variable 'network_connections' from source: include params 30564 1726882890.39829: variable 'interface' from source: play vars 30564 1726882890.39893: variable 'interface' from source: play vars 30564 1726882890.39970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882890.40131: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882890.40186: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882890.40215: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882890.40245: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882890.40292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882890.40316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882890.40341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.40374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882890.40422: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882890.40670: variable 'network_connections' from source: include params 30564 1726882890.40675: variable 'interface' from source: play vars 30564 1726882890.40739: variable 'interface' from source: play vars 30564 1726882890.40761: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882890.40770: when evaluation is False, skipping this task 30564 1726882890.40773: _execute() done 30564 1726882890.40775: dumping result to json 30564 1726882890.40778: done dumping result, returning 30564 1726882890.40782: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001b46] 30564 1726882890.40789: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b46 30564 1726882890.40891: done sending task result for task 
0e448fcc-3ce9-4216-acec-000000001b46 30564 1726882890.40900: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882890.40946: no more pending results, returning what we have 30564 1726882890.40950: results queue empty 30564 1726882890.40952: checking for any_errors_fatal 30564 1726882890.40960: done checking for any_errors_fatal 30564 1726882890.40960: checking for max_fail_percentage 30564 1726882890.40962: done checking for max_fail_percentage 30564 1726882890.40966: checking to see if all hosts have failed and the running result is not ok 30564 1726882890.40967: done checking to see if all hosts have failed 30564 1726882890.40967: getting the remaining hosts for this loop 30564 1726882890.40969: done getting the remaining hosts for this loop 30564 1726882890.40973: getting the next task for host managed_node2 30564 1726882890.40982: done getting next task for host managed_node2 30564 1726882890.40987: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882890.40994: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882890.41021: getting variables 30564 1726882890.41023: in VariableManager get_vars() 30564 1726882890.41067: Calling all_inventory to load vars for managed_node2 30564 1726882890.41070: Calling groups_inventory to load vars for managed_node2 30564 1726882890.41073: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882890.41085: Calling all_plugins_play to load vars for managed_node2 30564 1726882890.41088: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882890.41092: Calling groups_plugins_play to load vars for managed_node2 30564 1726882890.42833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882890.44858: done with get_vars() 30564 1726882890.44885: done getting variables 30564 1726882890.44956: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:30 -0400 (0:00:00.100) 0:01:29.031 ****** 30564 1726882890.44993: entering _queue_task() for managed_node2/service 30564 1726882890.45350: worker is 1 (out of 1 available) 30564 1726882890.45370: exiting _queue_task() for managed_node2/service 30564 1726882890.45392: done 
queuing things up, now waiting for results queue to drain 30564 1726882890.45394: waiting for pending results... 30564 1726882890.45617: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882890.45698: in run() - task 0e448fcc-3ce9-4216-acec-000000001b47 30564 1726882890.45710: variable 'ansible_search_path' from source: unknown 30564 1726882890.45714: variable 'ansible_search_path' from source: unknown 30564 1726882890.45743: calling self._execute() 30564 1726882890.45826: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882890.45831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882890.45844: variable 'omit' from source: magic vars 30564 1726882890.46136: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.46146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882890.46261: variable 'network_provider' from source: set_fact 30564 1726882890.46267: variable 'network_state' from source: role '' defaults 30564 1726882890.46281: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882890.46284: variable 'omit' from source: magic vars 30564 1726882890.46330: variable 'omit' from source: magic vars 30564 1726882890.46348: variable 'network_service_name' from source: role '' defaults 30564 1726882890.46400: variable 'network_service_name' from source: role '' defaults 30564 1726882890.46474: variable '__network_provider_setup' from source: role '' defaults 30564 1726882890.46477: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882890.46526: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882890.46533: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882890.46579: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882890.46733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882890.48739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882890.48797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882890.48824: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882890.48850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882890.48873: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882890.48928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.48948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.48967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.48998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.49010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.49040: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.49056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.49075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.49105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.49116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.49266: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882890.49340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.49356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.49375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.49400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.49413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.49476: variable 'ansible_python' from source: facts 30564 1726882890.49489: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882890.49545: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882890.49599: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882890.49686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.49702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.49719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.49746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.49757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.49792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882890.49811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882890.49827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.49854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882890.49870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882890.49956: variable 'network_connections' from source: include params 30564 1726882890.49965: variable 'interface' from source: play vars 30564 1726882890.50018: variable 'interface' from source: play vars 30564 1726882890.50093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882890.50221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882890.50255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882890.50291: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882890.50323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882890.50363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882890.50387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882890.50412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882890.50435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882890.50484: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882890.50986: variable 'network_connections' from source: include params 30564 1726882890.50990: variable 'interface' from source: play vars 30564 1726882890.50992: variable 'interface' from source: play vars 30564 1726882890.50994: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882890.50996: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882890.51208: variable 'network_connections' from source: include params 30564 1726882890.51216: variable 'interface' from source: play vars 30564 1726882890.51282: variable 'interface' from source: play vars 30564 1726882890.51309: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882890.51376: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882890.51653: variable 'network_connections' from source: include params 30564 1726882890.51656: variable 'interface' from source: play vars 30564 1726882890.51724: variable 'interface' from source: play vars 30564 1726882890.51775: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882890.51830: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882890.51837: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882890.51896: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882890.52104: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882890.52890: variable 'network_connections' from source: include params 30564 1726882890.52893: variable 'interface' from source: play vars 30564 1726882890.52955: variable 'interface' from source: play vars 30564 1726882890.52962: variable 'ansible_distribution' from source: facts 30564 1726882890.52972: variable '__network_rh_distros' from source: role '' defaults 30564 1726882890.52975: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.52987: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882890.53154: variable 'ansible_distribution' from source: facts 30564 1726882890.53158: variable '__network_rh_distros' from source: role '' defaults 30564 1726882890.53160: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.53645: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882890.53811: variable 'ansible_distribution' from source: facts 30564 1726882890.53815: variable '__network_rh_distros' from source: role '' defaults 30564 1726882890.53821: variable 'ansible_distribution_major_version' from source: facts 30564 1726882890.53854: variable 'network_provider' from source: set_fact 30564 1726882890.53878: variable 'omit' from source: magic vars 30564 1726882890.53904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882890.53932: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882890.53947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882890.53966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882890.53976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882890.54005: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882890.54008: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882890.54011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882890.54110: Set connection var ansible_timeout to 10 30564 1726882890.54113: Set connection var ansible_pipelining to False 30564 1726882890.54115: Set connection var ansible_shell_type to sh 30564 1726882890.54123: Set connection var ansible_shell_executable to /bin/sh 30564 1726882890.54130: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882890.54133: Set connection var ansible_connection to ssh 30564 1726882890.54159: variable 'ansible_shell_executable' from source: unknown 30564 1726882890.54164: variable 'ansible_connection' from source: unknown 30564 1726882890.54166: variable 'ansible_module_compression' from source: unknown 30564 1726882890.54172: variable 'ansible_shell_type' from source: unknown 30564 1726882890.54175: variable 'ansible_shell_executable' from source: unknown 30564 1726882890.54177: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882890.54179: variable 'ansible_pipelining' from source: unknown 30564 1726882890.54181: variable 'ansible_timeout' from source: unknown 30564 1726882890.54183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882890.54282: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882890.54289: variable 'omit' from source: magic vars 30564 1726882890.54297: starting attempt loop 30564 1726882890.54300: running the handler 30564 1726882890.54376: variable 'ansible_facts' from source: unknown 30564 1726882890.55175: _low_level_execute_command(): starting 30564 1726882890.55181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882890.55838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882890.55847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.55856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.55875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.55911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.55919: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882890.55928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.55940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882890.55947: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882890.55953: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882890.55961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.55973: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882890.56006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.56012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.56015: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882890.56017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.56085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882890.56103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882890.56114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882890.56246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882890.57928: stdout chunk (state=3): >>>/root <<< 30564 1726882890.58034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882890.58114: stderr chunk (state=3): >>><<< 30564 1726882890.58117: stdout chunk (state=3): >>><<< 30564 1726882890.58138: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882890.58148: _low_level_execute_command(): starting 30564 1726882890.58153: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521 `" && echo ansible-tmp-1726882890.581368-34512-184539971804521="` echo /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521 `" ) && sleep 0' 30564 1726882890.58812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882890.58820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.58830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.58863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.58902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.58909: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882890.58918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.58932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882890.58939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882890.59058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882890.59071: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882890.59080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.59092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.59100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.59107: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882890.59116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.59206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882890.59219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882890.59230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882890.59356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882890.61244: stdout chunk (state=3): >>>ansible-tmp-1726882890.581368-34512-184539971804521=/root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521 <<< 30564 1726882890.61408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882890.61422: stderr chunk (state=3): >>><<< 30564 1726882890.61425: stdout chunk (state=3): >>><<< 30564 1726882890.61432: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882890.581368-34512-184539971804521=/root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882890.61466: variable 'ansible_module_compression' from source: unknown 30564 1726882890.61522: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882890.61578: variable 'ansible_facts' from source: unknown 30564 1726882890.61780: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521/AnsiballZ_systemd.py 30564 1726882890.61922: Sending initial data 30564 1726882890.61925: Sent initial data (155 bytes) 30564 1726882890.62826: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882890.62835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.62846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.62857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.62897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.62904: stderr chunk (state=3): >>>debug2: match not found <<< 30564 
1726882890.62914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.63043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882890.63047: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882890.63049: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882890.63051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.63053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.63055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.63057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.63059: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882890.63061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.63189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882890.63193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882890.63196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882890.63476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882890.65056: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882890.65155: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882890.65256: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpoft64ugd /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521/AnsiballZ_systemd.py <<< 30564 1726882890.65353: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882890.68357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882890.68430: stderr chunk (state=3): >>><<< 30564 1726882890.68433: stdout chunk (state=3): >>><<< 30564 1726882890.68452: done transferring module to remote 30564 1726882890.68465: _low_level_execute_command(): starting 30564 1726882890.68472: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521/ /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521/AnsiballZ_systemd.py && sleep 0' 30564 1726882890.69054: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882890.69062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.69075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.69090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.69127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.69134: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882890.69144: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.69157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882890.69166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882890.69176: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882890.69183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.69192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.69203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.69210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.69216: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882890.69224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.69296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882890.69309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882890.69319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882890.69455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882890.71240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882890.71366: stderr chunk (state=3): >>><<< 30564 1726882890.71372: stdout chunk (state=3): >>><<< 30564 1726882890.71387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882890.71394: _low_level_execute_command(): starting 30564 1726882890.71397: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521/AnsiballZ_systemd.py && sleep 0' 30564 1726882890.71996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882890.72007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.72029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.72042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.72094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.72102: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882890.72112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.72132: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882890.72148: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882890.72155: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882890.72167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882890.72176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882890.72188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882890.72196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882890.72202: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882890.72212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882890.72298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882890.72311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882890.72321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882890.72446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882890.97359: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", 
"BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882890.97393: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9187328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2280220000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": 
"no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882890.97428: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": 
"shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": 
"none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882890.98912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882890.98972: stderr chunk (state=3): >>><<< 30564 1726882890.98976: stdout chunk (state=3): >>><<< 30564 1726882890.98986: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9187328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2280220000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 
21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882890.99103: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882890.99116: _low_level_execute_command(): starting 30564 1726882890.99121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882890.581368-34512-184539971804521/ > /dev/null 2>&1 && sleep 0' 30564 1726882890.99677: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882890.99691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882890.99828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882891.01678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882891.01692: stderr chunk (state=3): >>><<< 30564 1726882891.01695: stdout chunk (state=3): >>><<< 30564 1726882891.01708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882891.01715: handler run complete 30564 1726882891.01773: attempt loop complete, returning result 30564 1726882891.01776: _execute() done 30564 1726882891.01778: dumping result to json 30564 1726882891.01791: done dumping result, returning 30564 1726882891.01801: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000001b47] 30564 1726882891.01806: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b47 30564 1726882891.02093: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b47 30564 1726882891.02096: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882891.02154: no more pending results, returning what we have 30564 1726882891.02157: results queue empty 30564 1726882891.02159: checking for any_errors_fatal 30564 1726882891.02174: done checking for any_errors_fatal 30564 1726882891.02176: checking for max_fail_percentage 30564 1726882891.02177: done checking for max_fail_percentage 30564 1726882891.02178: checking to see if all hosts have failed and the running result is not ok 30564 1726882891.02179: done checking to see if all hosts have failed 30564 1726882891.02180: getting the remaining hosts for this loop 30564 1726882891.02181: done getting the remaining hosts for this loop 30564 
1726882891.02184: getting the next task for host managed_node2 30564 1726882891.02191: done getting next task for host managed_node2 30564 1726882891.02194: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882891.02200: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882891.02212: getting variables 30564 1726882891.02213: in VariableManager get_vars() 30564 1726882891.02245: Calling all_inventory to load vars for managed_node2 30564 1726882891.02248: Calling groups_inventory to load vars for managed_node2 30564 1726882891.02250: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882891.02258: Calling all_plugins_play to load vars for managed_node2 30564 1726882891.02261: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882891.02265: Calling groups_plugins_play to load vars for managed_node2 30564 1726882891.03917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882891.05998: done with get_vars() 30564 1726882891.06033: done getting variables 30564 1726882891.06103: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:31 -0400 (0:00:00.611) 0:01:29.642 ****** 30564 1726882891.06148: entering _queue_task() for managed_node2/service 30564 1726882891.06521: worker is 1 (out of 1 available) 30564 1726882891.06533: exiting _queue_task() for managed_node2/service 30564 1726882891.06547: done queuing things up, now waiting for results queue to drain 30564 1726882891.06553: waiting for pending results... 
30564 1726882891.06900: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882891.07062: in run() - task 0e448fcc-3ce9-4216-acec-000000001b48 30564 1726882891.07086: variable 'ansible_search_path' from source: unknown 30564 1726882891.07099: variable 'ansible_search_path' from source: unknown 30564 1726882891.07145: calling self._execute() 30564 1726882891.07271: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882891.07284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882891.07298: variable 'omit' from source: magic vars 30564 1726882891.07738: variable 'ansible_distribution_major_version' from source: facts 30564 1726882891.07760: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882891.07905: variable 'network_provider' from source: set_fact 30564 1726882891.07915: Evaluated conditional (network_provider == "nm"): True 30564 1726882891.08026: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882891.08136: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882891.08355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882891.11910: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882891.11998: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882891.12050: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882891.12091: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882891.12117: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882891.12206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882891.12239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882891.12291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882891.12337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882891.12358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882891.12419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882891.12451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882891.12502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882891.12622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882891.12642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882891.12735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882891.12829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882891.12857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882891.12953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882891.12988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882891.13396: variable 'network_connections' from source: include params 30564 1726882891.13413: variable 'interface' from source: play vars 30564 1726882891.13503: variable 'interface' from source: play vars 30564 1726882891.13647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882891.14153: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882891.14203: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882891.14259: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882891.14301: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882891.14361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882891.14395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882891.14431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882891.14477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882891.14532: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882891.14829: variable 'network_connections' from source: include params 30564 1726882891.14840: variable 'interface' from source: play vars 30564 1726882891.14922: variable 'interface' from source: play vars 30564 1726882891.14955: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882891.14965: when evaluation is False, skipping this task 30564 1726882891.14977: _execute() done 30564 1726882891.14986: dumping result to json 30564 1726882891.15000: done dumping result, returning 30564 1726882891.15013: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000001b48] 30564 
1726882891.15034: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b48 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882891.15204: no more pending results, returning what we have 30564 1726882891.15208: results queue empty 30564 1726882891.15209: checking for any_errors_fatal 30564 1726882891.15230: done checking for any_errors_fatal 30564 1726882891.15231: checking for max_fail_percentage 30564 1726882891.15233: done checking for max_fail_percentage 30564 1726882891.15234: checking to see if all hosts have failed and the running result is not ok 30564 1726882891.15235: done checking to see if all hosts have failed 30564 1726882891.15236: getting the remaining hosts for this loop 30564 1726882891.15238: done getting the remaining hosts for this loop 30564 1726882891.15242: getting the next task for host managed_node2 30564 1726882891.15251: done getting next task for host managed_node2 30564 1726882891.15255: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882891.15260: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882891.15295: getting variables 30564 1726882891.15298: in VariableManager get_vars() 30564 1726882891.15346: Calling all_inventory to load vars for managed_node2 30564 1726882891.15349: Calling groups_inventory to load vars for managed_node2 30564 1726882891.15352: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882891.15365: Calling all_plugins_play to load vars for managed_node2 30564 1726882891.15371: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882891.15375: Calling groups_plugins_play to load vars for managed_node2 30564 1726882891.16456: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b48 30564 1726882891.16460: WORKER PROCESS EXITING 30564 1726882891.17573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882891.19443: done with get_vars() 30564 1726882891.19472: done getting variables 30564 1726882891.19544: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:31 -0400 (0:00:00.134) 0:01:29.777 ****** 30564 1726882891.19583: entering _queue_task() for managed_node2/service 30564 1726882891.19953: worker is 1 (out of 1 available) 30564 
1726882891.19971: exiting _queue_task() for managed_node2/service 30564 1726882891.19983: done queuing things up, now waiting for results queue to drain 30564 1726882891.19984: waiting for pending results... 30564 1726882891.20342: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882891.20507: in run() - task 0e448fcc-3ce9-4216-acec-000000001b49 30564 1726882891.20530: variable 'ansible_search_path' from source: unknown 30564 1726882891.20543: variable 'ansible_search_path' from source: unknown 30564 1726882891.20588: calling self._execute() 30564 1726882891.20727: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882891.20739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882891.20758: variable 'omit' from source: magic vars 30564 1726882891.21227: variable 'ansible_distribution_major_version' from source: facts 30564 1726882891.21253: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882891.21398: variable 'network_provider' from source: set_fact 30564 1726882891.21414: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882891.21422: when evaluation is False, skipping this task 30564 1726882891.21430: _execute() done 30564 1726882891.21436: dumping result to json 30564 1726882891.21443: done dumping result, returning 30564 1726882891.21453: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000001b49] 30564 1726882891.21465: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b49 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882891.21644: no more pending results, returning what we have 30564 1726882891.21649: results queue empty 30564 1726882891.21651: 
checking for any_errors_fatal 30564 1726882891.21660: done checking for any_errors_fatal 30564 1726882891.21662: checking for max_fail_percentage 30564 1726882891.21666: done checking for max_fail_percentage 30564 1726882891.21667: checking to see if all hosts have failed and the running result is not ok 30564 1726882891.21670: done checking to see if all hosts have failed 30564 1726882891.21671: getting the remaining hosts for this loop 30564 1726882891.21673: done getting the remaining hosts for this loop 30564 1726882891.21679: getting the next task for host managed_node2 30564 1726882891.21689: done getting next task for host managed_node2 30564 1726882891.21694: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882891.21703: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882891.21739: getting variables 30564 1726882891.21742: in VariableManager get_vars() 30564 1726882891.21797: Calling all_inventory to load vars for managed_node2 30564 1726882891.21800: Calling groups_inventory to load vars for managed_node2 30564 1726882891.21803: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882891.21817: Calling all_plugins_play to load vars for managed_node2 30564 1726882891.21820: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882891.21824: Calling groups_plugins_play to load vars for managed_node2 30564 1726882891.22872: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b49 30564 1726882891.22875: WORKER PROCESS EXITING 30564 1726882891.23741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882891.27482: done with get_vars() 30564 1726882891.27518: done getting variables 30564 1726882891.27716: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:31 -0400 (0:00:00.082) 0:01:29.860 ****** 30564 1726882891.27878: entering _queue_task() for managed_node2/copy 30564 1726882891.28631: worker is 1 (out of 1 available) 30564 1726882891.28643: exiting _queue_task() for managed_node2/copy 30564 1726882891.28657: done queuing things up, now waiting for results queue to drain 30564 1726882891.28658: waiting for pending results... 
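
[editor's note] The "Enable network service" result a few records above came back with only a `censored` notice and `changed`, because that task sets `no_log: true`. A minimal sketch of that masking, assuming only the bookkeeping key `changed` survives (the helper name is illustrative, not Ansible's internal API):

```python
# Illustrative model of 'no_log: true' result masking; not Ansible's
# internal code. The notice string matches the one in the log above.
CENSORED = ("the output has been hidden due to the fact that 'no_log: true' "
            "was specified for this result")

def censor_result(result: dict, no_log: bool) -> dict:
    """Replace a task's output keys with a fixed notice, keeping 'changed'."""
    if not no_log:
        return result
    return {"censored": CENSORED, "changed": result.get("changed", False)}
```

This is why the `false_condition` and `skip_reason` fields visible for the other skipped tasks are absent from that one result.
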
30564 1726882891.29781: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882891.30186: in run() - task 0e448fcc-3ce9-4216-acec-000000001b4a 30564 1726882891.30243: variable 'ansible_search_path' from source: unknown 30564 1726882891.30308: variable 'ansible_search_path' from source: unknown 30564 1726882891.30355: calling self._execute() 30564 1726882891.30608: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882891.30796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882891.30813: variable 'omit' from source: magic vars 30564 1726882891.31637: variable 'ansible_distribution_major_version' from source: facts 30564 1726882891.31733: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882891.31979: variable 'network_provider' from source: set_fact 30564 1726882891.32081: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882891.32119: when evaluation is False, skipping this task 30564 1726882891.32147: _execute() done 30564 1726882891.32183: dumping result to json 30564 1726882891.32193: done dumping result, returning 30564 1726882891.32228: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000001b4a] 30564 1726882891.32324: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4a skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882891.32496: no more pending results, returning what we have 30564 1726882891.32500: results queue empty 30564 1726882891.32501: checking for any_errors_fatal 30564 1726882891.32507: done checking for any_errors_fatal 30564 1726882891.32508: checking for max_fail_percentage 30564 
1726882891.32510: done checking for max_fail_percentage 30564 1726882891.32511: checking to see if all hosts have failed and the running result is not ok 30564 1726882891.32512: done checking to see if all hosts have failed 30564 1726882891.32513: getting the remaining hosts for this loop 30564 1726882891.32514: done getting the remaining hosts for this loop 30564 1726882891.32518: getting the next task for host managed_node2 30564 1726882891.32526: done getting next task for host managed_node2 30564 1726882891.32531: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882891.32537: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882891.32557: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4a 30564 1726882891.32561: WORKER PROCESS EXITING 30564 1726882891.32584: getting variables 30564 1726882891.32587: in VariableManager get_vars() 30564 1726882891.32633: Calling all_inventory to load vars for managed_node2 30564 1726882891.32636: Calling groups_inventory to load vars for managed_node2 30564 1726882891.32639: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882891.32653: Calling all_plugins_play to load vars for managed_node2 30564 1726882891.32656: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882891.32659: Calling groups_plugins_play to load vars for managed_node2 30564 1726882891.35341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882891.38040: done with get_vars() 30564 1726882891.38770: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:31 -0400 (0:00:00.109) 0:01:29.969 ****** 30564 1726882891.38852: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882891.39177: worker is 1 (out of 1 available) 30564 1726882891.39189: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882891.39200: done queuing things up, now waiting for results queue to drain 30564 1726882891.39201: waiting for pending results... 
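
[editor's note] Three tasks in the stretch above ("Enable and start wpa_supplicant", "Enable network service", "Ensure initscripts network file dependency is present") were skipped after a `when` conditional evaluated False. A minimal model of that decision flow, with illustrative names rather than Ansible's real TaskExecutor API — conditions are checked in order and the first failing one is reported back, matching the `false_condition` field in the skip results above:

```python
# Illustrative sketch: evaluate a task's 'when' conditions in order and
# build a skip result naming the first condition that evaluated False.
def run_or_skip(conditions, evaluate):
    """conditions: list of expression strings; evaluate: str -> bool."""
    for cond in conditions:
        if not evaluate(cond):
            return {
                "changed": False,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    # Placeholder for actually dispatching the module when nothing failed.
    return {"changed": True}
```

With facts where `network_provider` is "nm", a condition list like `["ansible_distribution_major_version != '6'", "network_provider == 'initscripts'"]` passes the first check and skips on the second, which is exactly the pattern logged above.
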
30564 1726882891.39902: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882891.40241: in run() - task 0e448fcc-3ce9-4216-acec-000000001b4b 30564 1726882891.40312: variable 'ansible_search_path' from source: unknown 30564 1726882891.40320: variable 'ansible_search_path' from source: unknown 30564 1726882891.40450: calling self._execute() 30564 1726882891.40684: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882891.40704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882891.40721: variable 'omit' from source: magic vars 30564 1726882891.42308: variable 'ansible_distribution_major_version' from source: facts 30564 1726882891.42487: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882891.42499: variable 'omit' from source: magic vars 30564 1726882891.42687: variable 'omit' from source: magic vars 30564 1726882891.43208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882891.46741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882891.46829: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882891.46874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882891.46924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882891.46953: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882891.47050: variable 'network_provider' from source: set_fact 30564 1726882891.47207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882891.47249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882891.47288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882891.47342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882891.47361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882891.47460: variable 'omit' from source: magic vars 30564 1726882891.47588: variable 'omit' from source: magic vars 30564 1726882891.47706: variable 'network_connections' from source: include params 30564 1726882891.47727: variable 'interface' from source: play vars 30564 1726882891.47803: variable 'interface' from source: play vars 30564 1726882891.47973: variable 'omit' from source: magic vars 30564 1726882891.47986: variable '__lsr_ansible_managed' from source: task vars 30564 1726882891.48050: variable '__lsr_ansible_managed' from source: task vars 30564 1726882891.48246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882891.48485: Loaded config def from plugin (lookup/template) 30564 1726882891.48494: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882891.48525: File lookup term: get_ansible_managed.j2 30564 1726882891.48540: variable 
'ansible_search_path' from source: unknown 30564 1726882891.48551: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882891.48574: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882891.48596: variable 'ansible_search_path' from source: unknown 30564 1726882891.55387: variable 'ansible_managed' from source: unknown 30564 1726882891.55529: variable 'omit' from source: magic vars 30564 1726882891.55561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882891.55595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882891.55617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882891.55638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882891.55652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882891.55689: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882891.55699: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882891.55709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882891.55810: Set connection var ansible_timeout to 10 30564 1726882891.55823: Set connection var ansible_pipelining to False 30564 1726882891.55830: Set connection var ansible_shell_type to sh 30564 1726882891.55841: Set connection var ansible_shell_executable to /bin/sh 30564 1726882891.55853: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882891.55860: Set connection var ansible_connection to ssh 30564 1726882891.55894: variable 'ansible_shell_executable' from source: unknown 30564 1726882891.55903: variable 'ansible_connection' from source: unknown 30564 1726882891.55910: variable 'ansible_module_compression' from source: unknown 30564 1726882891.55917: variable 'ansible_shell_type' from source: unknown 30564 1726882891.55924: variable 'ansible_shell_executable' from source: unknown 30564 1726882891.55933: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882891.55941: variable 'ansible_pipelining' from source: unknown 30564 1726882891.55948: variable 'ansible_timeout' from source: unknown 30564 1726882891.55956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882891.56093: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882891.56116: variable 'omit' from 
source: magic vars 30564 1726882891.56127: starting attempt loop 30564 1726882891.56135: running the handler 30564 1726882891.56154: _low_level_execute_command(): starting 30564 1726882891.56174: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882891.56671: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.56675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.56689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.56699: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882891.56706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.56727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.56730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.56777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882891.56786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882891.56799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882891.56918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882891.58568: stdout 
chunk (state=3): >>>/root <<< 30564 1726882891.58675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882891.58717: stderr chunk (state=3): >>><<< 30564 1726882891.58723: stdout chunk (state=3): >>><<< 30564 1726882891.58745: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882891.58757: _low_level_execute_command(): starting 30564 1726882891.58762: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007 `" && echo ansible-tmp-1726882891.5874538-34552-122707626254007="` echo /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007 `" ) && sleep 0' 30564 1726882891.59294: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
30564 1726882891.59297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.59369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.59373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.59377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.59380: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882891.59382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.59384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882891.59395: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882891.59398: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882891.59404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.59414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.59425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.59433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.59440: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882891.59449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.59525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882891.59538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882891.59549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882891.59671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882891.61532: stdout chunk (state=3): >>>ansible-tmp-1726882891.5874538-34552-122707626254007=/root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007 <<< 30564 1726882891.61702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882891.61706: stdout chunk (state=3): >>><<< 30564 1726882891.61713: stderr chunk (state=3): >>><<< 30564 1726882891.61730: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882891.5874538-34552-122707626254007=/root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882891.61779: variable 'ansible_module_compression' from source: unknown 30564 1726882891.61823: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882891.61870: variable 'ansible_facts' from source: unknown 30564 1726882891.61999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007/AnsiballZ_network_connections.py 30564 1726882891.62138: Sending initial data 30564 1726882891.62141: Sent initial data (168 bytes) 30564 1726882891.63061: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882891.63075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.63087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.63101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.63138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.63149: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882891.63159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.63181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882891.63189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882891.63196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882891.63205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.63214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.63225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.63232: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.63239: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882891.63248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.63328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882891.63352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882891.63355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882891.63488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882891.65215: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882891.65312: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882891.65415: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp__kj1e1i /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007/AnsiballZ_network_connections.py <<< 30564 1726882891.65512: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882891.67270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882891.67409: stderr chunk (state=3): >>><<< 30564 
1726882891.67412: stdout chunk (state=3): >>><<< 30564 1726882891.67435: done transferring module to remote 30564 1726882891.67446: _low_level_execute_command(): starting 30564 1726882891.67451: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007/ /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007/AnsiballZ_network_connections.py && sleep 0' 30564 1726882891.68123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882891.68132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.68143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.68157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.68204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.68211: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882891.68221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.68234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882891.68242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882891.68248: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882891.68256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.68267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.68282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.68296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.68302: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882891.68312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.68388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882891.68407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882891.68419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882891.68540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882891.70368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882891.70374: stdout chunk (state=3): >>><<< 30564 1726882891.70382: stderr chunk (state=3): >>><<< 30564 1726882891.70397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882891.70400: _low_level_execute_command(): starting 30564 1726882891.70404: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007/AnsiballZ_network_connections.py && sleep 0' 30564 1726882891.71005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882891.71015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.71023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.71036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.71076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.71086: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882891.71096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.71110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882891.71115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882891.71123: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882891.71130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.71140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.71151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.71159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.71167: stderr chunk (state=3): 
>>>debug2: match found <<< 30564 1726882891.71182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.71252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882891.71262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882891.71282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882891.71414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882891.95837: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zhy7i_io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zhy7i_io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/891d4ab6-2d22-4634-8d3b-2e935067cc98: error=unknown <<< 30564 1726882891.96017: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}} <<< 30564 1726882891.97508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882891.97555: stderr chunk (state=3): >>><<< 30564 1726882891.97559: stdout chunk (state=3): >>><<< 30564 1726882891.97678: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zhy7i_io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_zhy7i_io/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/891d4ab6-2d22-4634-8d3b-2e935067cc98: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882891.97682: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882891.97685: _low_level_execute_command(): starting 30564 1726882891.97687: _low_level_execute_command(): executing: /bin/sh -c 
'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882891.5874538-34552-122707626254007/ > /dev/null 2>&1 && sleep 0' 30564 1726882891.98353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882891.98372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.98389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.98409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.98471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.98485: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882891.98501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.98520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882891.98532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882891.98551: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882891.98566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882891.98586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882891.98603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882891.98617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882891.98629: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882891.98642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882891.98731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
30564 1726882891.98754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882891.98783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882891.98916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882892.00727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882892.00770: stderr chunk (state=3): >>><<< 30564 1726882892.00776: stdout chunk (state=3): >>><<< 30564 1726882892.00790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882892.00796: handler run complete 30564 1726882892.00819: attempt loop complete, returning result 30564 1726882892.00822: _execute() done 30564 1726882892.00825: dumping result to json 30564 1726882892.00829: done dumping result, returning 30564 
1726882892.00837: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-000000001b4b] 30564 1726882892.00842: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4b 30564 1726882892.00952: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4b 30564 1726882892.00956: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30564 1726882892.01086: no more pending results, returning what we have 30564 1726882892.01091: results queue empty 30564 1726882892.01092: checking for any_errors_fatal 30564 1726882892.01099: done checking for any_errors_fatal 30564 1726882892.01100: checking for max_fail_percentage 30564 1726882892.01101: done checking for max_fail_percentage 30564 1726882892.01102: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.01103: done checking to see if all hosts have failed 30564 1726882892.01104: getting the remaining hosts for this loop 30564 1726882892.01106: done getting the remaining hosts for this loop 30564 1726882892.01109: getting the next task for host managed_node2 30564 1726882892.01116: done getting next task for host managed_node2 30564 1726882892.01119: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882892.01125: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882892.01139: getting variables 30564 1726882892.01140: in VariableManager get_vars() 30564 1726882892.01189: Calling all_inventory to load vars for managed_node2 30564 1726882892.01191: Calling groups_inventory to load vars for managed_node2 30564 1726882892.01194: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.01204: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.01206: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.01209: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.02221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.03800: done with get_vars() 30564 1726882892.03816: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:32 -0400 
(0:00:00.650) 0:01:30.620 ****** 30564 1726882892.03907: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882892.04137: worker is 1 (out of 1 available) 30564 1726882892.04152: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882892.04166: done queuing things up, now waiting for results queue to drain 30564 1726882892.04167: waiting for pending results... 30564 1726882892.04381: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882892.04490: in run() - task 0e448fcc-3ce9-4216-acec-000000001b4c 30564 1726882892.04503: variable 'ansible_search_path' from source: unknown 30564 1726882892.04506: variable 'ansible_search_path' from source: unknown 30564 1726882892.04537: calling self._execute() 30564 1726882892.04623: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.04628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.04640: variable 'omit' from source: magic vars 30564 1726882892.04932: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.04943: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.05031: variable 'network_state' from source: role '' defaults 30564 1726882892.05040: Evaluated conditional (network_state != {}): False 30564 1726882892.05045: when evaluation is False, skipping this task 30564 1726882892.05048: _execute() done 30564 1726882892.05051: dumping result to json 30564 1726882892.05054: done dumping result, returning 30564 1726882892.05057: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000001b4c] 30564 1726882892.05068: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4c 30564 1726882892.05161: done sending task result for task 
0e448fcc-3ce9-4216-acec-000000001b4c 30564 1726882892.05166: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882892.05225: no more pending results, returning what we have 30564 1726882892.05229: results queue empty 30564 1726882892.05230: checking for any_errors_fatal 30564 1726882892.05240: done checking for any_errors_fatal 30564 1726882892.05240: checking for max_fail_percentage 30564 1726882892.05242: done checking for max_fail_percentage 30564 1726882892.05243: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.05243: done checking to see if all hosts have failed 30564 1726882892.05244: getting the remaining hosts for this loop 30564 1726882892.05246: done getting the remaining hosts for this loop 30564 1726882892.05249: getting the next task for host managed_node2 30564 1726882892.05257: done getting next task for host managed_node2 30564 1726882892.05260: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882892.05267: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882892.05293: getting variables 30564 1726882892.05298: in VariableManager get_vars() 30564 1726882892.05334: Calling all_inventory to load vars for managed_node2 30564 1726882892.05336: Calling groups_inventory to load vars for managed_node2 30564 1726882892.05339: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.05347: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.05350: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.05351: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.06153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.07752: done with get_vars() 30564 1726882892.07775: done getting variables 30564 1726882892.07831: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:32 -0400 (0:00:00.039) 0:01:30.659 ****** 30564 1726882892.07868: entering _queue_task() for managed_node2/debug 30564 1726882892.08127: worker is 1 (out of 1 available) 30564 1726882892.08140: exiting _queue_task() for managed_node2/debug 30564 1726882892.08153: done queuing things up, now waiting for results 
queue to drain 30564 1726882892.08155: waiting for pending results... 30564 1726882892.08480: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882892.08633: in run() - task 0e448fcc-3ce9-4216-acec-000000001b4d 30564 1726882892.08655: variable 'ansible_search_path' from source: unknown 30564 1726882892.08666: variable 'ansible_search_path' from source: unknown 30564 1726882892.08715: calling self._execute() 30564 1726882892.08832: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.08844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.08861: variable 'omit' from source: magic vars 30564 1726882892.09256: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.09277: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.09288: variable 'omit' from source: magic vars 30564 1726882892.09355: variable 'omit' from source: magic vars 30564 1726882892.09396: variable 'omit' from source: magic vars 30564 1726882892.09444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882892.09491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882892.09516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882892.09540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882892.09558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882892.09598: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882892.09606: variable 'ansible_host' from source: host vars for 
'managed_node2' 30564 1726882892.09614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.09725: Set connection var ansible_timeout to 10 30564 1726882892.09736: Set connection var ansible_pipelining to False 30564 1726882892.09743: Set connection var ansible_shell_type to sh 30564 1726882892.09753: Set connection var ansible_shell_executable to /bin/sh 30564 1726882892.09767: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882892.09775: Set connection var ansible_connection to ssh 30564 1726882892.09808: variable 'ansible_shell_executable' from source: unknown 30564 1726882892.09817: variable 'ansible_connection' from source: unknown 30564 1726882892.09824: variable 'ansible_module_compression' from source: unknown 30564 1726882892.09830: variable 'ansible_shell_type' from source: unknown 30564 1726882892.09837: variable 'ansible_shell_executable' from source: unknown 30564 1726882892.09843: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.09851: variable 'ansible_pipelining' from source: unknown 30564 1726882892.09857: variable 'ansible_timeout' from source: unknown 30564 1726882892.09867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.10018: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882892.10035: variable 'omit' from source: magic vars 30564 1726882892.10044: starting attempt loop 30564 1726882892.10052: running the handler 30564 1726882892.10183: variable '__network_connections_result' from source: set_fact 30564 1726882892.10242: handler run complete 30564 1726882892.10268: attempt loop complete, returning result 30564 1726882892.10275: _execute() 
done 30564 1726882892.10284: dumping result to json 30564 1726882892.10291: done dumping result, returning 30564 1726882892.10303: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001b4d] 30564 1726882892.10315: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4d 30564 1726882892.10425: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4d 30564 1726882892.10434: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 30564 1726882892.10522: no more pending results, returning what we have 30564 1726882892.10525: results queue empty 30564 1726882892.10527: checking for any_errors_fatal 30564 1726882892.10534: done checking for any_errors_fatal 30564 1726882892.10535: checking for max_fail_percentage 30564 1726882892.10537: done checking for max_fail_percentage 30564 1726882892.10538: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.10539: done checking to see if all hosts have failed 30564 1726882892.10540: getting the remaining hosts for this loop 30564 1726882892.10542: done getting the remaining hosts for this loop 30564 1726882892.10546: getting the next task for host managed_node2 30564 1726882892.10555: done getting next task for host managed_node2 30564 1726882892.10560: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882892.10568: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882892.10583: getting variables 30564 1726882892.10585: in VariableManager get_vars() 30564 1726882892.10626: Calling all_inventory to load vars for managed_node2 30564 1726882892.10628: Calling groups_inventory to load vars for managed_node2 30564 1726882892.10631: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.10641: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.10647: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.10650: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.13330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.16058: done with get_vars() 30564 1726882892.16089: done getting variables 30564 1726882892.16154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:32 -0400 (0:00:00.083) 0:01:30.743 ****** 30564 1726882892.16199: entering _queue_task() for managed_node2/debug 30564 1726882892.16532: worker is 1 (out of 1 available) 30564 1726882892.16546: exiting _queue_task() for managed_node2/debug 30564 1726882892.16559: done queuing things up, now waiting for results queue to drain 30564 1726882892.16560: waiting for pending results... 30564 1726882892.17686: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882892.17962: in run() - task 0e448fcc-3ce9-4216-acec-000000001b4e 30564 1726882892.17986: variable 'ansible_search_path' from source: unknown 30564 1726882892.17993: variable 'ansible_search_path' from source: unknown 30564 1726882892.18072: calling self._execute() 30564 1726882892.18369: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.18383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.18399: variable 'omit' from source: magic vars 30564 1726882892.19254: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.19277: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.19288: variable 'omit' from source: magic vars 30564 1726882892.19363: variable 'omit' from source: magic vars 30564 1726882892.19491: variable 'omit' from source: magic vars 30564 1726882892.19536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882892.19705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882892.19729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882892.19752: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882892.19771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882892.19812: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882892.19898: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.19906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.20048: Set connection var ansible_timeout to 10 30564 1726882892.20118: Set connection var ansible_pipelining to False 30564 1726882892.20224: Set connection var ansible_shell_type to sh 30564 1726882892.20236: Set connection var ansible_shell_executable to /bin/sh 30564 1726882892.20248: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882892.20254: Set connection var ansible_connection to ssh 30564 1726882892.20284: variable 'ansible_shell_executable' from source: unknown 30564 1726882892.20292: variable 'ansible_connection' from source: unknown 30564 1726882892.20299: variable 'ansible_module_compression' from source: unknown 30564 1726882892.20305: variable 'ansible_shell_type' from source: unknown 30564 1726882892.20311: variable 'ansible_shell_executable' from source: unknown 30564 1726882892.20319: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.20332: variable 'ansible_pipelining' from source: unknown 30564 1726882892.20338: variable 'ansible_timeout' from source: unknown 30564 1726882892.20344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.20590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882892.20607: variable 'omit' from source: magic vars 30564 1726882892.20617: starting attempt loop 30564 1726882892.20623: running the handler 30564 1726882892.20685: variable '__network_connections_result' from source: set_fact 30564 1726882892.20771: variable '__network_connections_result' from source: set_fact 30564 1726882892.20888: handler run complete 30564 1726882892.20919: attempt loop complete, returning result 30564 1726882892.20926: _execute() done 30564 1726882892.20933: dumping result to json 30564 1726882892.20941: done dumping result, returning 30564 1726882892.20953: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001b4e] 30564 1726882892.20962: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4e ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30564 1726882892.21183: no more pending results, returning what we have 30564 1726882892.21187: results queue empty 30564 1726882892.21188: checking for any_errors_fatal 30564 1726882892.21195: done checking for any_errors_fatal 30564 1726882892.21196: checking for max_fail_percentage 30564 1726882892.21198: done checking for max_fail_percentage 30564 1726882892.21199: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.21200: done checking to see if all hosts have failed 30564 1726882892.21201: getting the 
remaining hosts for this loop 30564 1726882892.21203: done getting the remaining hosts for this loop 30564 1726882892.21206: getting the next task for host managed_node2 30564 1726882892.21215: done getting next task for host managed_node2 30564 1726882892.21219: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882892.21226: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882892.21239: getting variables 30564 1726882892.21241: in VariableManager get_vars() 30564 1726882892.21285: Calling all_inventory to load vars for managed_node2 30564 1726882892.21288: Calling groups_inventory to load vars for managed_node2 30564 1726882892.21291: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.21302: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.21305: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.21308: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.22281: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4e 30564 1726882892.22290: WORKER PROCESS EXITING 30564 1726882892.23677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.26643: done with get_vars() 30564 1726882892.26672: done getting variables 30564 1726882892.26738: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:32 -0400 (0:00:00.105) 0:01:30.849 ****** 30564 1726882892.26774: entering _queue_task() for managed_node2/debug 30564 1726882892.27117: worker is 1 (out of 1 available) 30564 1726882892.27131: exiting _queue_task() for managed_node2/debug 30564 1726882892.27149: done queuing things up, now waiting for results queue to drain 30564 1726882892.27150: waiting for pending results... 
30564 1726882892.27455: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882892.27553: in run() - task 0e448fcc-3ce9-4216-acec-000000001b4f 30564 1726882892.27593: variable 'ansible_search_path' from source: unknown 30564 1726882892.27596: variable 'ansible_search_path' from source: unknown 30564 1726882892.27610: calling self._execute() 30564 1726882892.27892: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.27897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.27899: variable 'omit' from source: magic vars 30564 1726882892.28108: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.28121: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.28245: variable 'network_state' from source: role '' defaults 30564 1726882892.28254: Evaluated conditional (network_state != {}): False 30564 1726882892.28257: when evaluation is False, skipping this task 30564 1726882892.28260: _execute() done 30564 1726882892.28265: dumping result to json 30564 1726882892.28270: done dumping result, returning 30564 1726882892.28278: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000001b4f] 30564 1726882892.28283: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4f 30564 1726882892.28380: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b4f 30564 1726882892.28382: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882892.28450: no more pending results, returning what we have 30564 1726882892.28453: results queue empty 30564 1726882892.28454: checking for any_errors_fatal 30564 1726882892.28462: done checking for any_errors_fatal 30564 1726882892.28462: checking for 
max_fail_percentage 30564 1726882892.28467: done checking for max_fail_percentage 30564 1726882892.28470: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.28471: done checking to see if all hosts have failed 30564 1726882892.28472: getting the remaining hosts for this loop 30564 1726882892.28474: done getting the remaining hosts for this loop 30564 1726882892.28477: getting the next task for host managed_node2 30564 1726882892.28485: done getting next task for host managed_node2 30564 1726882892.28489: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882892.28494: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882892.28517: getting variables 30564 1726882892.28518: in VariableManager get_vars() 30564 1726882892.28551: Calling all_inventory to load vars for managed_node2 30564 1726882892.28554: Calling groups_inventory to load vars for managed_node2 30564 1726882892.28556: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.28567: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.28572: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.28575: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.29713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.31403: done with get_vars() 30564 1726882892.31431: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:32 -0400 (0:00:00.047) 0:01:30.896 ****** 30564 1726882892.31549: entering _queue_task() for managed_node2/ping 30564 1726882892.31889: worker is 1 (out of 1 available) 30564 1726882892.31902: exiting _queue_task() for managed_node2/ping 30564 1726882892.31912: done queuing things up, now waiting for results queue to drain 30564 1726882892.31913: waiting for pending results... 
30564 1726882892.32249: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882892.32430: in run() - task 0e448fcc-3ce9-4216-acec-000000001b50 30564 1726882892.32450: variable 'ansible_search_path' from source: unknown 30564 1726882892.32459: variable 'ansible_search_path' from source: unknown 30564 1726882892.32513: calling self._execute() 30564 1726882892.32637: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.32648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.32666: variable 'omit' from source: magic vars 30564 1726882892.33119: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.33146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.33157: variable 'omit' from source: magic vars 30564 1726882892.33240: variable 'omit' from source: magic vars 30564 1726882892.33290: variable 'omit' from source: magic vars 30564 1726882892.33340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882892.33397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882892.33423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882892.33445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882892.33475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882892.33513: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882892.33522: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.33528: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882892.33647: Set connection var ansible_timeout to 10 30564 1726882892.33658: Set connection var ansible_pipelining to False 30564 1726882892.33672: Set connection var ansible_shell_type to sh 30564 1726882892.33689: Set connection var ansible_shell_executable to /bin/sh 30564 1726882892.33702: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882892.33710: Set connection var ansible_connection to ssh 30564 1726882892.33742: variable 'ansible_shell_executable' from source: unknown 30564 1726882892.33785: variable 'ansible_connection' from source: unknown 30564 1726882892.33801: variable 'ansible_module_compression' from source: unknown 30564 1726882892.33808: variable 'ansible_shell_type' from source: unknown 30564 1726882892.33814: variable 'ansible_shell_executable' from source: unknown 30564 1726882892.33822: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.33833: variable 'ansible_pipelining' from source: unknown 30564 1726882892.33839: variable 'ansible_timeout' from source: unknown 30564 1726882892.33846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.34757: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882892.34780: variable 'omit' from source: magic vars 30564 1726882892.34789: starting attempt loop 30564 1726882892.34796: running the handler 30564 1726882892.34813: _low_level_execute_command(): starting 30564 1726882892.34825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882892.36216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882892.36234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882892.36250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.36288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.36334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.36347: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882892.36361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.36393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882892.36411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882892.36423: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882892.36436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.36448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.36463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.36481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.36501: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882892.36520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.36602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882892.36635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882892.36654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882892.36799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882892.38462: stdout chunk (state=3): >>>/root <<< 30564 1726882892.38583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882892.38672: stderr chunk (state=3): >>><<< 30564 1726882892.38678: stdout chunk (state=3): >>><<< 30564 1726882892.38808: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882892.38812: _low_level_execute_command(): starting 30564 1726882892.38815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992 `" && echo ansible-tmp-1726882892.3870745-34595-276035221278992="` echo /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992 `" ) && sleep 0' 30564 1726882892.39449: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882892.39463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.39490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.39515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.39558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.39576: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882892.39593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.39622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882892.39634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882892.39644: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882892.39655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.39675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.39691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.39705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.39752: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882892.39772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.39882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882892.39903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882892.39918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882892.40053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882892.41929: stdout chunk (state=3): >>>ansible-tmp-1726882892.3870745-34595-276035221278992=/root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992 <<< 30564 1726882892.42121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882892.42124: stdout chunk (state=3): >>><<< 30564 1726882892.42127: stderr chunk (state=3): >>><<< 30564 1726882892.42374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882892.3870745-34595-276035221278992=/root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882892.42377: variable 'ansible_module_compression' from source: unknown 30564 1726882892.42380: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882892.42382: variable 'ansible_facts' from source: unknown 30564 1726882892.42384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992/AnsiballZ_ping.py 30564 1726882892.42701: Sending initial data 30564 1726882892.42704: Sent initial data (153 bytes) 30564 1726882892.43731: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.43735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.43760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.43780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.43784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.43853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882892.43867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882892.43997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882892.45756: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882892.45851: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882892.45950: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpokh4dazd /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992/AnsiballZ_ping.py <<< 30564 1726882892.46047: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882892.47414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882892.47480: stderr chunk (state=3): >>><<< 30564 1726882892.47484: stdout chunk (state=3): >>><<< 30564 1726882892.47504: done transferring module to remote 30564 1726882892.47515: _low_level_execute_command(): starting 30564 1726882892.47520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992/ /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992/AnsiballZ_ping.py && sleep 0' 30564 1726882892.48106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882892.48115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.48126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882892.48140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.48180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.48190: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882892.48197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.48210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882892.48218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882892.48225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882892.48233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.48241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.48251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.48258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.48265: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882892.48279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.48343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882892.48356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882892.48367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882892.48491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882892.50258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882892.50339: 
stderr chunk (state=3): >>><<< 30564 1726882892.50350: stdout chunk (state=3): >>><<< 30564 1726882892.50456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882892.50460: _low_level_execute_command(): starting 30564 1726882892.50462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992/AnsiballZ_ping.py && sleep 0' 30564 1726882892.51054: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882892.51073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.51091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.51120: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.51163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.51178: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882892.51192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.51208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882892.51227: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882892.51241: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882892.51252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.51265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.51281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.51292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.51301: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882892.51313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.51399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882892.51421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882892.51447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882892.51592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882892.64474: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882892.65475: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882892.65549: stderr chunk (state=3): >>><<< 30564 1726882892.65553: stdout chunk (state=3): >>><<< 30564 1726882892.65586: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882892.65613: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882892.65621: _low_level_execute_command(): starting 30564 1726882892.65624: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882892.3870745-34595-276035221278992/ > /dev/null 2>&1 && sleep 0' 30564 1726882892.66323: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882892.66336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.66353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.66395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.66466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.66480: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882892.66496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.66508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882892.66516: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 
1726882892.66522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882892.66530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882892.66540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882892.66556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882892.66559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882892.66582: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882892.66602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882892.66670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882892.66680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882892.66792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882892.68604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882892.68651: stderr chunk (state=3): >>><<< 30564 1726882892.68655: stdout chunk (state=3): >>><<< 30564 1726882892.68670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882892.68676: handler run complete 30564 1726882892.68690: attempt loop complete, returning result 30564 1726882892.68693: _execute() done 30564 1726882892.68695: dumping result to json 30564 1726882892.68697: done dumping result, returning 30564 1726882892.68707: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-000000001b50] 30564 1726882892.68711: sending task result for task 0e448fcc-3ce9-4216-acec-000000001b50 30564 1726882892.68805: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001b50 30564 1726882892.68808: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882892.68885: no more pending results, returning what we have 30564 1726882892.68888: results queue empty 30564 1726882892.68889: checking for any_errors_fatal 30564 1726882892.68896: done checking for any_errors_fatal 30564 1726882892.68896: checking for max_fail_percentage 30564 1726882892.68898: done checking for max_fail_percentage 30564 1726882892.68899: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.68900: done checking to see if all hosts have failed 30564 1726882892.68901: getting the remaining hosts for this loop 30564 1726882892.68902: done getting the remaining hosts for this loop 30564 1726882892.68906: getting the next task for host managed_node2 30564 
1726882892.68915: done getting next task for host managed_node2 30564 1726882892.68919: ^ task is: TASK: meta (role_complete) 30564 1726882892.68924: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882892.68938: getting variables 30564 1726882892.68939: in VariableManager get_vars() 30564 1726882892.68985: Calling all_inventory to load vars for managed_node2 30564 1726882892.68988: Calling groups_inventory to load vars for managed_node2 30564 1726882892.68990: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.69000: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.69003: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.69005: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.70437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.71587: done with get_vars() 30564 1726882892.71604: done getting variables 30564 1726882892.71666: done queuing things up, now waiting for results queue to drain 30564 1726882892.71669: results queue empty 30564 1726882892.71670: checking for any_errors_fatal 30564 1726882892.71672: done checking for any_errors_fatal 30564 1726882892.71672: checking for max_fail_percentage 30564 1726882892.71673: done checking for max_fail_percentage 30564 1726882892.71674: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.71675: done checking to see if all hosts have failed 30564 1726882892.71675: getting the remaining hosts for this loop 30564 1726882892.71676: done getting the remaining hosts for this loop 30564 1726882892.71678: getting the next task for host managed_node2 30564 1726882892.71685: done getting next task for host managed_node2 30564 1726882892.71686: ^ task is: TASK: Test 30564 1726882892.71688: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882892.71690: getting variables 30564 1726882892.71691: in VariableManager get_vars() 30564 1726882892.71700: Calling all_inventory to load vars for managed_node2 30564 1726882892.71702: Calling groups_inventory to load vars for managed_node2 30564 1726882892.71703: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.71707: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.71708: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.71710: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.72482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.73957: done with get_vars() 30564 1726882892.73982: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:41:32 -0400 (0:00:00.425) 0:01:31.321 ****** 30564 1726882892.74054: entering _queue_task() for managed_node2/include_tasks 30564 1726882892.74379: worker is 1 (out of 1 available) 30564 1726882892.74391: exiting _queue_task() for managed_node2/include_tasks 30564 1726882892.74403: done queuing things up, now waiting for results queue to drain 30564 1726882892.74404: waiting for pending results... 
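The ping task recorded above walks through Ansible's standard remote module lifecycle: `_low_level_execute_command()` creates a private per-task temp directory, the AnsiballZ payload is pushed over the existing SSH multiplexed connection with `sftp put`, the directory and module are made executable, the module is run with the remote Python, and the temp directory is removed. The commands below are a local, self-contained sketch of that sequence, lifted from the shell commands visible in the log; the directory name and the stand-in module are simplifications (the real tmpdir name embeds a timestamp and PID, and the real payload is a zipped AnsiballZ wrapper, not a one-line script).

```shell
set -eu

# Stand-in for /root/.ansible/tmp on the managed node.
base="$(mktemp -d)"
tmp="$base/ansible-tmp-demo"   # real name: ansible-tmp-<timestamp>-<pid>-<rand>

# 1) Create the per-task temp dir; umask 77 keeps it private to the remote user,
#    matching the log's: ( umask 77 && mkdir -p ... && mkdir ... )
( umask 77 && mkdir -p "$base" && mkdir "$tmp" )

# 2) "Transfer" the module payload. The log does this with `sftp put`; here a
#    trivial stand-in script plays the role of AnsiballZ_ping.py.
printf '%s\n' 'print({"ping": "pong"})' > "$tmp/AnsiballZ_ping.py"

# 3) chmod u+x on both the directory and the module, then execute with python,
#    as in the log's `chmod u+x ... && sleep 0` and `/usr/bin/python3.9 ...` steps.
chmod u+x "$tmp" "$tmp/AnsiballZ_ping.py"
out="$(python3 "$tmp/AnsiballZ_ping.py")"
echo "$out"

# 4) Clean up, matching the log's final `rm -f -r .../ > /dev/null 2>&1`.
rm -rf "$tmp"
```

Each numbered step corresponds to one `_low_level_execute_command()` (or sftp transfer) round trip in the log; the surrounding `debug1:`/`debug2:` chatter is just the OpenSSH client re-reading its configuration and reusing the ControlMaster session for every round trip.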
30564 1726882892.74738: running TaskExecutor() for managed_node2/TASK: Test 30564 1726882892.74874: in run() - task 0e448fcc-3ce9-4216-acec-000000001748 30564 1726882892.74894: variable 'ansible_search_path' from source: unknown 30564 1726882892.74901: variable 'ansible_search_path' from source: unknown 30564 1726882892.74949: variable 'lsr_test' from source: include params 30564 1726882892.75186: variable 'lsr_test' from source: include params 30564 1726882892.75258: variable 'omit' from source: magic vars 30564 1726882892.75422: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.75439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.75455: variable 'omit' from source: magic vars 30564 1726882892.75708: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.75729: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.75740: variable 'item' from source: unknown 30564 1726882892.75812: variable 'item' from source: unknown 30564 1726882892.75852: variable 'item' from source: unknown 30564 1726882892.75920: variable 'item' from source: unknown 30564 1726882892.76086: dumping result to json 30564 1726882892.76094: done dumping result, returning 30564 1726882892.76103: done running TaskExecutor() for managed_node2/TASK: Test [0e448fcc-3ce9-4216-acec-000000001748] 30564 1726882892.76112: sending task result for task 0e448fcc-3ce9-4216-acec-000000001748 30564 1726882892.76202: no more pending results, returning what we have 30564 1726882892.76208: in VariableManager get_vars() 30564 1726882892.76253: Calling all_inventory to load vars for managed_node2 30564 1726882892.76256: Calling groups_inventory to load vars for managed_node2 30564 1726882892.76260: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.76279: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.76284: Calling 
groups_plugins_inventory to load vars for managed_node2 30564 1726882892.76287: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.77383: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001748 30564 1726882892.77387: WORKER PROCESS EXITING 30564 1726882892.78014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.79709: done with get_vars() 30564 1726882892.79730: variable 'ansible_search_path' from source: unknown 30564 1726882892.79731: variable 'ansible_search_path' from source: unknown 30564 1726882892.79775: we have included files to process 30564 1726882892.79777: generating all_blocks data 30564 1726882892.79779: done generating all_blocks data 30564 1726882892.79784: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882892.79786: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882892.79788: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882892.79980: done processing included file 30564 1726882892.79983: iterating over new_blocks loaded from include file 30564 1726882892.79984: in VariableManager get_vars() 30564 1726882892.80001: done with get_vars() 30564 1726882892.80003: filtering new block on tags 30564 1726882892.80031: done filtering new block on tags 30564 1726882892.80033: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 30564 1726882892.80039: extending task lists for all hosts with included blocks 30564 1726882892.81031: done 
extending task lists 30564 1726882892.81032: done processing included files 30564 1726882892.81033: results queue empty 30564 1726882892.81033: checking for any_errors_fatal 30564 1726882892.81035: done checking for any_errors_fatal 30564 1726882892.81035: checking for max_fail_percentage 30564 1726882892.81036: done checking for max_fail_percentage 30564 1726882892.81037: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.81037: done checking to see if all hosts have failed 30564 1726882892.81037: getting the remaining hosts for this loop 30564 1726882892.81038: done getting the remaining hosts for this loop 30564 1726882892.81040: getting the next task for host managed_node2 30564 1726882892.81043: done getting next task for host managed_node2 30564 1726882892.81045: ^ task is: TASK: Include network role 30564 1726882892.81047: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882892.81049: getting variables 30564 1726882892.81050: in VariableManager get_vars() 30564 1726882892.81057: Calling all_inventory to load vars for managed_node2 30564 1726882892.81059: Calling groups_inventory to load vars for managed_node2 30564 1726882892.81060: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.81066: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.81071: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.81073: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.86411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.87339: done with get_vars() 30564 1726882892.87357: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:41:32 -0400 (0:00:00.133) 0:01:31.455 ****** 30564 1726882892.87415: entering _queue_task() for managed_node2/include_role 30564 1726882892.87683: worker is 1 (out of 1 available) 30564 1726882892.87696: exiting _queue_task() for managed_node2/include_role 30564 1726882892.87712: done queuing things up, now waiting for results queue to drain 30564 1726882892.87714: waiting for pending results... 
30564 1726882892.87972: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882892.88146: in run() - task 0e448fcc-3ce9-4216-acec-000000001ca9 30564 1726882892.88179: variable 'ansible_search_path' from source: unknown 30564 1726882892.88188: variable 'ansible_search_path' from source: unknown 30564 1726882892.88233: calling self._execute() 30564 1726882892.88363: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.88391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.88407: variable 'omit' from source: magic vars 30564 1726882892.88884: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.88907: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.88926: _execute() done 30564 1726882892.88938: dumping result to json 30564 1726882892.88946: done dumping result, returning 30564 1726882892.88955: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-000000001ca9] 30564 1726882892.88967: sending task result for task 0e448fcc-3ce9-4216-acec-000000001ca9 30564 1726882892.89137: no more pending results, returning what we have 30564 1726882892.89143: in VariableManager get_vars() 30564 1726882892.89195: Calling all_inventory to load vars for managed_node2 30564 1726882892.89199: Calling groups_inventory to load vars for managed_node2 30564 1726882892.89203: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.89219: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.89224: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.89227: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.90053: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001ca9 30564 1726882892.90057: WORKER PROCESS EXITING 30564 1726882892.90423: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.91470: done with get_vars() 30564 1726882892.91486: variable 'ansible_search_path' from source: unknown 30564 1726882892.91488: variable 'ansible_search_path' from source: unknown 30564 1726882892.91579: variable 'omit' from source: magic vars 30564 1726882892.91609: variable 'omit' from source: magic vars 30564 1726882892.91619: variable 'omit' from source: magic vars 30564 1726882892.91622: we have included files to process 30564 1726882892.91622: generating all_blocks data 30564 1726882892.91624: done generating all_blocks data 30564 1726882892.91625: processing included file: fedora.linux_system_roles.network 30564 1726882892.91637: in VariableManager get_vars() 30564 1726882892.91647: done with get_vars() 30564 1726882892.91667: in VariableManager get_vars() 30564 1726882892.91679: done with get_vars() 30564 1726882892.91707: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882892.91781: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882892.91831: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882892.92101: in VariableManager get_vars() 30564 1726882892.92115: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882892.93807: iterating over new_blocks loaded from include file 30564 1726882892.93808: in VariableManager get_vars() 30564 1726882892.93820: done with get_vars() 30564 1726882892.93821: filtering new block on tags 30564 1726882892.93994: done filtering new block on tags 30564 1726882892.93998: in VariableManager get_vars() 30564 1726882892.94010: done with get_vars() 30564 1726882892.94011: filtering new block on tags 30564 1726882892.94031: done 
filtering new block on tags 30564 1726882892.94033: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882892.94041: extending task lists for all hosts with included blocks 30564 1726882892.94118: done extending task lists 30564 1726882892.94119: done processing included files 30564 1726882892.94120: results queue empty 30564 1726882892.94124: checking for any_errors_fatal 30564 1726882892.94129: done checking for any_errors_fatal 30564 1726882892.94131: checking for max_fail_percentage 30564 1726882892.94132: done checking for max_fail_percentage 30564 1726882892.94133: checking to see if all hosts have failed and the running result is not ok 30564 1726882892.94134: done checking to see if all hosts have failed 30564 1726882892.94134: getting the remaining hosts for this loop 30564 1726882892.94136: done getting the remaining hosts for this loop 30564 1726882892.94139: getting the next task for host managed_node2 30564 1726882892.94145: done getting next task for host managed_node2 30564 1726882892.94147: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882892.94150: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882892.94157: getting variables 30564 1726882892.94157: in VariableManager get_vars() 30564 1726882892.94170: Calling all_inventory to load vars for managed_node2 30564 1726882892.94171: Calling groups_inventory to load vars for managed_node2 30564 1726882892.94172: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.94176: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.94177: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.94179: Calling groups_plugins_play to load vars for managed_node2 30564 1726882892.95398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882892.96543: done with get_vars() 30564 1726882892.96558: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:32 -0400 (0:00:00.091) 0:01:31.547 ****** 30564 1726882892.96615: entering _queue_task() for managed_node2/include_tasks 30564 1726882892.96859: worker is 1 (out of 1 available) 30564 1726882892.96877: exiting _queue_task() for managed_node2/include_tasks 30564 1726882892.96890: done queuing things up, now waiting for results queue to drain 30564 1726882892.96891: waiting for pending results... 
30564 1726882892.97092: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882892.97177: in run() - task 0e448fcc-3ce9-4216-acec-000000001d2b 30564 1726882892.97189: variable 'ansible_search_path' from source: unknown 30564 1726882892.97193: variable 'ansible_search_path' from source: unknown 30564 1726882892.97223: calling self._execute() 30564 1726882892.97308: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882892.97312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882892.97321: variable 'omit' from source: magic vars 30564 1726882892.97616: variable 'ansible_distribution_major_version' from source: facts 30564 1726882892.97626: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882892.97632: _execute() done 30564 1726882892.97635: dumping result to json 30564 1726882892.97637: done dumping result, returning 30564 1726882892.97644: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-000000001d2b] 30564 1726882892.97649: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2b 30564 1726882892.97738: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2b 30564 1726882892.97741: WORKER PROCESS EXITING 30564 1726882892.97814: no more pending results, returning what we have 30564 1726882892.97818: in VariableManager get_vars() 30564 1726882892.97874: Calling all_inventory to load vars for managed_node2 30564 1726882892.97877: Calling groups_inventory to load vars for managed_node2 30564 1726882892.97880: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882892.97889: Calling all_plugins_play to load vars for managed_node2 30564 1726882892.97892: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882892.97895: Calling 
groups_plugins_play to load vars for managed_node2 30564 1726882892.99340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882893.01415: done with get_vars() 30564 1726882893.01438: variable 'ansible_search_path' from source: unknown 30564 1726882893.01439: variable 'ansible_search_path' from source: unknown 30564 1726882893.01497: we have included files to process 30564 1726882893.01498: generating all_blocks data 30564 1726882893.01500: done generating all_blocks data 30564 1726882893.01503: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882893.01504: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882893.01506: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882893.02110: done processing included file 30564 1726882893.02112: iterating over new_blocks loaded from include file 30564 1726882893.02113: in VariableManager get_vars() 30564 1726882893.02133: done with get_vars() 30564 1726882893.02134: filtering new block on tags 30564 1726882893.02154: done filtering new block on tags 30564 1726882893.02156: in VariableManager get_vars() 30564 1726882893.02173: done with get_vars() 30564 1726882893.02174: filtering new block on tags 30564 1726882893.02202: done filtering new block on tags 30564 1726882893.02204: in VariableManager get_vars() 30564 1726882893.02218: done with get_vars() 30564 1726882893.02219: filtering new block on tags 30564 1726882893.02247: done filtering new block on tags 30564 1726882893.02249: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882893.02253: extending task lists for 
all hosts with included blocks 30564 1726882893.03280: done extending task lists 30564 1726882893.03281: done processing included files 30564 1726882893.03281: results queue empty 30564 1726882893.03282: checking for any_errors_fatal 30564 1726882893.03284: done checking for any_errors_fatal 30564 1726882893.03285: checking for max_fail_percentage 30564 1726882893.03285: done checking for max_fail_percentage 30564 1726882893.03286: checking to see if all hosts have failed and the running result is not ok 30564 1726882893.03286: done checking to see if all hosts have failed 30564 1726882893.03287: getting the remaining hosts for this loop 30564 1726882893.03288: done getting the remaining hosts for this loop 30564 1726882893.03290: getting the next task for host managed_node2 30564 1726882893.03293: done getting next task for host managed_node2 30564 1726882893.03295: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882893.03298: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882893.03305: getting variables 30564 1726882893.03305: in VariableManager get_vars() 30564 1726882893.03317: Calling all_inventory to load vars for managed_node2 30564 1726882893.03319: Calling groups_inventory to load vars for managed_node2 30564 1726882893.03320: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882893.03324: Calling all_plugins_play to load vars for managed_node2 30564 1726882893.03325: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882893.03327: Calling groups_plugins_play to load vars for managed_node2 30564 1726882893.04039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882893.04980: done with get_vars() 30564 1726882893.04996: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:33 -0400 (0:00:00.084) 0:01:31.631 ****** 30564 1726882893.05050: entering _queue_task() for managed_node2/setup 30564 1726882893.05304: worker is 1 (out of 1 available) 30564 1726882893.05315: exiting _queue_task() for managed_node2/setup 30564 1726882893.05329: done queuing things up, now waiting for results queue to drain 30564 1726882893.05330: waiting for pending results... 
30564 1726882893.05536: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882893.05637: in run() - task 0e448fcc-3ce9-4216-acec-000000001d82 30564 1726882893.05649: variable 'ansible_search_path' from source: unknown 30564 1726882893.05654: variable 'ansible_search_path' from source: unknown 30564 1726882893.05688: calling self._execute() 30564 1726882893.05775: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882893.05781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882893.05790: variable 'omit' from source: magic vars 30564 1726882893.06089: variable 'ansible_distribution_major_version' from source: facts 30564 1726882893.06100: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882893.06253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882893.07850: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882893.07901: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882893.07927: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882893.07953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882893.07980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882893.08035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882893.08056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882893.08081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882893.08109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882893.08120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882893.08157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882893.08178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882893.08198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882893.08223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882893.08234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882893.08358: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882893.08367: variable 'ansible_facts' from source: unknown 30564 1726882893.08868: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882893.08872: when evaluation is False, skipping this task 30564 1726882893.08876: _execute() done 30564 1726882893.08879: dumping result to json 30564 1726882893.08882: done dumping result, returning 30564 1726882893.08889: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000001d82] 30564 1726882893.08893: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d82 30564 1726882893.08982: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d82 30564 1726882893.08985: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882893.09032: no more pending results, returning what we have 30564 1726882893.09035: results queue empty 30564 1726882893.09036: checking for any_errors_fatal 30564 1726882893.09039: done checking for any_errors_fatal 30564 1726882893.09039: checking for max_fail_percentage 30564 1726882893.09041: done checking for max_fail_percentage 30564 1726882893.09042: checking to see if all hosts have failed and the running result is not ok 30564 1726882893.09042: done checking to see if all hosts have failed 30564 1726882893.09043: getting the remaining hosts for this loop 30564 1726882893.09045: done getting the remaining hosts for this loop 30564 1726882893.09048: getting the next task for host managed_node2 30564 1726882893.09061: done getting next task for host managed_node2 30564 1726882893.09066: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882893.09074: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882893.09107: getting variables 30564 1726882893.09109: in VariableManager get_vars() 30564 1726882893.09151: Calling all_inventory to load vars for managed_node2 30564 1726882893.09153: Calling groups_inventory to load vars for managed_node2 30564 1726882893.09156: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882893.09167: Calling all_plugins_play to load vars for managed_node2 30564 1726882893.09170: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882893.09180: Calling groups_plugins_play to load vars for managed_node2 30564 1726882893.10042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882893.11114: done with get_vars() 30564 1726882893.11130: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:33 -0400 (0:00:00.061) 0:01:31.693 ****** 30564 1726882893.11208: entering _queue_task() for managed_node2/stat 30564 1726882893.11422: worker is 1 (out of 1 available) 30564 1726882893.11438: exiting _queue_task() for managed_node2/stat 30564 1726882893.11448: done queuing things up, now waiting for results queue to drain 30564 1726882893.11450: waiting for pending results... 
30564 1726882893.11645: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882893.11744: in run() - task 0e448fcc-3ce9-4216-acec-000000001d84 30564 1726882893.11756: variable 'ansible_search_path' from source: unknown 30564 1726882893.11759: variable 'ansible_search_path' from source: unknown 30564 1726882893.11794: calling self._execute() 30564 1726882893.11880: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882893.11886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882893.11895: variable 'omit' from source: magic vars 30564 1726882893.12172: variable 'ansible_distribution_major_version' from source: facts 30564 1726882893.12185: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882893.12304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882893.12497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882893.12528: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882893.12551: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882893.12583: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882893.12644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882893.12662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882893.12692: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882893.12708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882893.12768: variable '__network_is_ostree' from source: set_fact 30564 1726882893.12781: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882893.12784: when evaluation is False, skipping this task 30564 1726882893.12787: _execute() done 30564 1726882893.12789: dumping result to json 30564 1726882893.12793: done dumping result, returning 30564 1726882893.12796: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000001d84] 30564 1726882893.12803: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d84 30564 1726882893.13044: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d84 30564 1726882893.13048: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882893.13147: no more pending results, returning what we have 30564 1726882893.13151: results queue empty 30564 1726882893.13152: checking for any_errors_fatal 30564 1726882893.13162: done checking for any_errors_fatal 30564 1726882893.13165: checking for max_fail_percentage 30564 1726882893.13167: done checking for max_fail_percentage 30564 1726882893.13168: checking to see if all hosts have failed and the running result is not ok 30564 1726882893.13168: done checking to see if all hosts have failed 30564 1726882893.13169: getting the remaining hosts for this loop 30564 1726882893.13171: done getting the remaining hosts for this loop 30564 
1726882893.13182: getting the next task for host managed_node2 30564 1726882893.13191: done getting next task for host managed_node2 30564 1726882893.13195: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882893.13202: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882893.13227: getting variables 30564 1726882893.13229: in VariableManager get_vars() 30564 1726882893.13270: Calling all_inventory to load vars for managed_node2 30564 1726882893.13273: Calling groups_inventory to load vars for managed_node2 30564 1726882893.13275: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882893.13293: Calling all_plugins_play to load vars for managed_node2 30564 1726882893.13296: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882893.13300: Calling groups_plugins_play to load vars for managed_node2 30564 1726882893.14120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882893.15622: done with get_vars() 30564 1726882893.15647: done getting variables 30564 1726882893.15716: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:33 -0400 (0:00:00.045) 0:01:31.738 ****** 30564 1726882893.15753: entering _queue_task() for managed_node2/set_fact 30564 1726882893.16077: worker is 1 (out of 1 available) 30564 1726882893.16089: exiting _queue_task() for managed_node2/set_fact 30564 1726882893.16106: done queuing things up, now waiting for results queue to drain 30564 1726882893.16108: waiting for pending results... 
30564 1726882893.16417: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882893.16602: in run() - task 0e448fcc-3ce9-4216-acec-000000001d85 30564 1726882893.16620: variable 'ansible_search_path' from source: unknown 30564 1726882893.16627: variable 'ansible_search_path' from source: unknown 30564 1726882893.16682: calling self._execute() 30564 1726882893.16806: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882893.16818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882893.16832: variable 'omit' from source: magic vars 30564 1726882893.17250: variable 'ansible_distribution_major_version' from source: facts 30564 1726882893.17271: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882893.17460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882893.17760: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882893.17813: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882893.17861: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882893.17902: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882893.18005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882893.18034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882893.18082: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882893.18115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882893.18223: variable '__network_is_ostree' from source: set_fact 30564 1726882893.18234: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882893.18241: when evaluation is False, skipping this task 30564 1726882893.18247: _execute() done 30564 1726882893.18253: dumping result to json 30564 1726882893.18260: done dumping result, returning 30564 1726882893.18273: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000001d85] 30564 1726882893.18297: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d85 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882893.18450: no more pending results, returning what we have 30564 1726882893.18454: results queue empty 30564 1726882893.18455: checking for any_errors_fatal 30564 1726882893.18465: done checking for any_errors_fatal 30564 1726882893.18466: checking for max_fail_percentage 30564 1726882893.18468: done checking for max_fail_percentage 30564 1726882893.18470: checking to see if all hosts have failed and the running result is not ok 30564 1726882893.18470: done checking to see if all hosts have failed 30564 1726882893.18471: getting the remaining hosts for this loop 30564 1726882893.18473: done getting the remaining hosts for this loop 30564 1726882893.18477: getting the next task for host managed_node2 30564 1726882893.18491: done getting next task for host managed_node2 30564 
1726882893.18495: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882893.18503: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882893.18531: getting variables 30564 1726882893.18533: in VariableManager get_vars() 30564 1726882893.18581: Calling all_inventory to load vars for managed_node2 30564 1726882893.18584: Calling groups_inventory to load vars for managed_node2 30564 1726882893.18587: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882893.18598: Calling all_plugins_play to load vars for managed_node2 30564 1726882893.18601: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882893.18605: Calling groups_plugins_play to load vars for managed_node2 30564 1726882893.19451: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d85 30564 1726882893.19455: WORKER PROCESS EXITING 30564 1726882893.19951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882893.20987: done with get_vars() 30564 1726882893.21009: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:33 -0400 (0:00:00.053) 0:01:31.792 ****** 30564 1726882893.21113: entering _queue_task() for managed_node2/service_facts 30564 1726882893.21425: worker is 1 (out of 1 available) 30564 1726882893.21438: exiting _queue_task() for managed_node2/service_facts 30564 1726882893.21451: done queuing things up, now waiting for results queue to drain 30564 1726882893.21452: waiting for pending results... 
30564 1726882893.21818: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882893.21998: in run() - task 0e448fcc-3ce9-4216-acec-000000001d87 30564 1726882893.22022: variable 'ansible_search_path' from source: unknown 30564 1726882893.22029: variable 'ansible_search_path' from source: unknown 30564 1726882893.22082: calling self._execute() 30564 1726882893.22202: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882893.22214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882893.22232: variable 'omit' from source: magic vars 30564 1726882893.22654: variable 'ansible_distribution_major_version' from source: facts 30564 1726882893.22678: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882893.22689: variable 'omit' from source: magic vars 30564 1726882893.22791: variable 'omit' from source: magic vars 30564 1726882893.22831: variable 'omit' from source: magic vars 30564 1726882893.22891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882893.22936: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882893.22971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882893.22999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882893.23014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882893.23052: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882893.23067: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882893.23076: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882893.23189: Set connection var ansible_timeout to 10 30564 1726882893.23200: Set connection var ansible_pipelining to False 30564 1726882893.23208: Set connection var ansible_shell_type to sh 30564 1726882893.23217: Set connection var ansible_shell_executable to /bin/sh 30564 1726882893.23227: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882893.23233: Set connection var ansible_connection to ssh 30564 1726882893.23269: variable 'ansible_shell_executable' from source: unknown 30564 1726882893.23282: variable 'ansible_connection' from source: unknown 30564 1726882893.23290: variable 'ansible_module_compression' from source: unknown 30564 1726882893.23296: variable 'ansible_shell_type' from source: unknown 30564 1726882893.23302: variable 'ansible_shell_executable' from source: unknown 30564 1726882893.23310: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882893.23320: variable 'ansible_pipelining' from source: unknown 30564 1726882893.23327: variable 'ansible_timeout' from source: unknown 30564 1726882893.23334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882893.23557: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882893.23577: variable 'omit' from source: magic vars 30564 1726882893.23590: starting attempt loop 30564 1726882893.23600: running the handler 30564 1726882893.23621: _low_level_execute_command(): starting 30564 1726882893.23631: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882893.24422: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882893.24436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882893.24464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.24487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.24536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.24547: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882893.24560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.24586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882893.24598: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882893.24613: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882893.24627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.24640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.24655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.24670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.24686: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882893.24704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.24787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882893.24816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882893.24834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882893.24977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882893.26636: stdout chunk (state=3): >>>/root <<< 30564 1726882893.26741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882893.26827: stderr chunk (state=3): >>><<< 30564 1726882893.26838: stdout chunk (state=3): >>><<< 30564 1726882893.26966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882893.26970: _low_level_execute_command(): starting 30564 1726882893.26974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273 `" && echo ansible-tmp-1726882893.2686353-34634-238590192859273="` echo /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273 `" ) && sleep 0' 30564 1726882893.27581: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882893.27594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.27618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.27634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.27677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.27690: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882893.27703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.27727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882893.27741: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882893.27752: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882893.27768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.27783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.27798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.27810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.27821: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882893.27844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.27921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882893.27941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882893.27966: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882893.28095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882893.29991: stdout chunk (state=3): >>>ansible-tmp-1726882893.2686353-34634-238590192859273=/root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273 <<< 30564 1726882893.30192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882893.30195: stdout chunk (state=3): >>><<< 30564 1726882893.30197: stderr chunk (state=3): >>><<< 30564 1726882893.30475: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882893.2686353-34634-238590192859273=/root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882893.30480: variable 'ansible_module_compression' from source: unknown 30564 1726882893.30482: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882893.30484: variable 'ansible_facts' from source: unknown 30564 1726882893.30487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273/AnsiballZ_service_facts.py 30564 1726882893.31151: Sending initial data 30564 1726882893.31154: Sent initial data (162 bytes) 30564 1726882893.35629: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.35633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.35670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.35674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.35678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.35786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882893.35855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882893.35972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882893.37743: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882893.37842: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882893.37942: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp3zv5n469 /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273/AnsiballZ_service_facts.py <<< 30564 1726882893.38047: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882893.39482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882893.39576: stderr chunk (state=3): >>><<< 30564 1726882893.39596: stdout chunk (state=3): >>><<< 30564 1726882893.39599: done transferring module to remote 30564 1726882893.39614: _low_level_execute_command(): starting 30564 1726882893.39617: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273/ /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273/AnsiballZ_service_facts.py && sleep 0' 30564 1726882893.40502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882893.40510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.40519: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.40534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.40575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.40583: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882893.40593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.40605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882893.40614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882893.40621: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882893.40626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.40636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.40646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.40653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.40660: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882893.40675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.40741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882893.40751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882893.40786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882893.41079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882893.42815: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 30564 1726882893.42819: stdout chunk (state=3): >>><<< 30564 1726882893.42826: stderr chunk (state=3): >>><<< 30564 1726882893.42841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882893.42844: _low_level_execute_command(): starting 30564 1726882893.42849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273/AnsiballZ_service_facts.py && sleep 0' 30564 1726882893.43423: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882893.43431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.43441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.43454: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.43498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.43505: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882893.43515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.43529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882893.43536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882893.43544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882893.43551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882893.43558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882893.43577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882893.43584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882893.43591: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882893.43601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882893.43674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882893.43689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882893.43699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882893.43835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882894.76935: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": 
{"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 30564 1726882894.76950: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 30564 1726882894.76966: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 30564 1726882894.76973: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": 
"systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", <<< 30564 1726882894.76977: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "sys<<< 30564 1726882894.76981: stdout chunk (state=3): >>>temd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": 
{"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882894.78177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882894.78234: stderr chunk (state=3): >>><<< 30564 1726882894.78238: stdout chunk (state=3): >>><<< 30564 1726882894.78273: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": 
{"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": 
"teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882894.78753: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882894.78760: _low_level_execute_command(): starting 30564 1726882894.78766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882893.2686353-34634-238590192859273/ > /dev/null 2>&1 && sleep 0' 30564 1726882894.79242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882894.79246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882894.79282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882894.79296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882894.79306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882894.79354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882894.79371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882894.79482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882894.81281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882894.81333: stderr chunk (state=3): >>><<< 30564 1726882894.81336: stdout chunk (state=3): >>><<< 30564 1726882894.81352: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882894.81358: handler run 
complete 30564 1726882894.81474: variable 'ansible_facts' from source: unknown 30564 1726882894.81578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882894.81834: variable 'ansible_facts' from source: unknown 30564 1726882894.81914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882894.82024: attempt loop complete, returning result 30564 1726882894.82027: _execute() done 30564 1726882894.82032: dumping result to json 30564 1726882894.82067: done dumping result, returning 30564 1726882894.82077: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000001d87] 30564 1726882894.82082: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d87 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882894.82742: no more pending results, returning what we have 30564 1726882894.82745: results queue empty 30564 1726882894.82746: checking for any_errors_fatal 30564 1726882894.82752: done checking for any_errors_fatal 30564 1726882894.82752: checking for max_fail_percentage 30564 1726882894.82754: done checking for max_fail_percentage 30564 1726882894.82755: checking to see if all hosts have failed and the running result is not ok 30564 1726882894.82755: done checking to see if all hosts have failed 30564 1726882894.82756: getting the remaining hosts for this loop 30564 1726882894.82757: done getting the remaining hosts for this loop 30564 1726882894.82760: getting the next task for host managed_node2 30564 1726882894.82787: done getting next task for host managed_node2 30564 1726882894.82791: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882894.82798: ^ state is: HOST STATE: 
block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882894.82810: getting variables 30564 1726882894.82812: in VariableManager get_vars() 30564 1726882894.82847: Calling all_inventory to load vars for managed_node2 30564 1726882894.82849: Calling groups_inventory to load vars for managed_node2 30564 1726882894.82852: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882894.82861: Calling all_plugins_play to load vars for managed_node2 30564 1726882894.82867: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882894.82873: Calling groups_plugins_play to load vars for managed_node2 30564 1726882894.83938: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d87 30564 1726882894.83942: WORKER PROCESS EXITING 30564 1726882894.84810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882894.86569: done with get_vars() 30564 1726882894.86598: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:34 -0400 (0:00:01.655) 0:01:33.448 ****** 30564 1726882894.86703: entering _queue_task() for managed_node2/package_facts 30564 1726882894.87060: worker is 1 (out of 1 available) 30564 1726882894.87073: exiting _queue_task() for managed_node2/package_facts 30564 1726882894.87087: done queuing things up, now waiting for results queue to drain 30564 1726882894.87088: waiting for pending results... 
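The task banner above carries a timing line of the form `Friday 20 September 2024 21:41:34 -0400 (0:00:01.655) 0:01:33.448 ******`. A sketch of extracting those two durations; reading the first value as the previous task's duration and the second as cumulative playbook elapsed time is an assumption based on how the numbers grow through the log:

```python
import re

# Timing line copied verbatim from the log above.
line = ("Friday 20 September 2024 21:41:34 -0400 (0:00:01.655) "
        "0:01:33.448 ******")

def to_seconds(hms: str) -> float:
    """Convert an H:MM:SS.mmm string to seconds."""
    h, m, s = hms.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

# First parenthesized duration: assumed per-task time;
# second bare duration: assumed cumulative elapsed time.
match = re.search(r"\((\d+:\d+:\d+\.\d+)\)\s+(\d+:\d+:\d+\.\d+)", line)
task_s, total_s = (to_seconds(g) for g in match.groups())
print(task_s, total_s)
# → 1.655 93.448
```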
30564 1726882894.87398: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882894.87580: in run() - task 0e448fcc-3ce9-4216-acec-000000001d88 30564 1726882894.87604: variable 'ansible_search_path' from source: unknown 30564 1726882894.87612: variable 'ansible_search_path' from source: unknown 30564 1726882894.87661: calling self._execute() 30564 1726882894.87785: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882894.87797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882894.87815: variable 'omit' from source: magic vars 30564 1726882894.88226: variable 'ansible_distribution_major_version' from source: facts 30564 1726882894.88246: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882894.88258: variable 'omit' from source: magic vars 30564 1726882894.88346: variable 'omit' from source: magic vars 30564 1726882894.88385: variable 'omit' from source: magic vars 30564 1726882894.88440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882894.88483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882894.88513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882894.88537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882894.88554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882894.88589: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882894.88598: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882894.88606: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882894.88717: Set connection var ansible_timeout to 10 30564 1726882894.88731: Set connection var ansible_pipelining to False 30564 1726882894.88739: Set connection var ansible_shell_type to sh 30564 1726882894.88749: Set connection var ansible_shell_executable to /bin/sh 30564 1726882894.88762: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882894.88773: Set connection var ansible_connection to ssh 30564 1726882894.88803: variable 'ansible_shell_executable' from source: unknown 30564 1726882894.88812: variable 'ansible_connection' from source: unknown 30564 1726882894.88819: variable 'ansible_module_compression' from source: unknown 30564 1726882894.88828: variable 'ansible_shell_type' from source: unknown 30564 1726882894.88838: variable 'ansible_shell_executable' from source: unknown 30564 1726882894.88845: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882894.88853: variable 'ansible_pipelining' from source: unknown 30564 1726882894.88859: variable 'ansible_timeout' from source: unknown 30564 1726882894.88870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882894.89077: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882894.89097: variable 'omit' from source: magic vars 30564 1726882894.89107: starting attempt loop 30564 1726882894.89114: running the handler 30564 1726882894.89133: _low_level_execute_command(): starting 30564 1726882894.89145: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882894.89916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882894.89937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882894.89952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882894.89974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882894.90019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882894.90035: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882894.90050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882894.90072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882894.90086: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882894.90099: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882894.90113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882894.90128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882894.90149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882894.90166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882894.90180: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882894.90196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882894.90276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882894.90298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882894.90314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882894.90449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882894.92110: stdout chunk (state=3): >>>/root <<< 30564 1726882894.92223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882894.92329: stderr chunk (state=3): >>><<< 30564 1726882894.92342: stdout chunk (state=3): >>><<< 30564 1726882894.92477: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882894.92481: _low_level_execute_command(): starting 30564 1726882894.92484: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657 `" && echo ansible-tmp-1726882894.9237692-34677-96528507853657="` echo /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657 `" ) && sleep 0' 30564 1726882894.93506: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882894.93509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882894.93543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882894.93547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882894.93549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882894.93622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882894.93641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882894.93775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882894.95624: stdout chunk (state=3): >>>ansible-tmp-1726882894.9237692-34677-96528507853657=/root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657 <<< 30564 1726882894.95815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882894.95819: stdout chunk (state=3): >>><<< 30564 1726882894.95821: stderr chunk (state=3): >>><<< 30564 1726882894.95870: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882894.9237692-34677-96528507853657=/root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882894.96072: variable 'ansible_module_compression' from source: unknown 30564 1726882894.96075: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882894.96078: variable 'ansible_facts' from source: unknown 30564 1726882894.96236: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657/AnsiballZ_package_facts.py 30564 1726882894.96424: Sending initial data 30564 1726882894.96427: Sent initial data (161 bytes) 30564 1726882894.97756: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882894.97761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882894.97806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882894.97810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882894.97814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882894.97889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882894.97901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882894.98025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882894.99754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882894.99758: stderr chunk (state=3): >>>debug2: 
Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882894.99849: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882894.99958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpgb4_g0mj /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657/AnsiballZ_package_facts.py <<< 30564 1726882895.00053: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882895.02623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882895.02708: stderr chunk (state=3): >>><<< 30564 1726882895.02712: stdout chunk (state=3): >>><<< 30564 1726882895.02733: done transferring module to remote 30564 1726882895.02743: _low_level_execute_command(): starting 30564 1726882895.02748: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657/ /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657/AnsiballZ_package_facts.py && sleep 0' 30564 1726882895.03383: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882895.03389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882895.03400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882895.03414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882895.03452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882895.03459: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882895.03476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882895.03487: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882895.03496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882895.03504: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882895.03511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882895.03518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882895.03530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882895.03537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882895.03543: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882895.03553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882895.03629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882895.03642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882895.03652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882895.03779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882895.05631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882895.05635: stdout chunk (state=3): >>><<< 30564 1726882895.05638: stderr chunk (state=3): >>><<< 30564 1726882895.05729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882895.05733: _low_level_execute_command(): starting 30564 1726882895.05736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657/AnsiballZ_package_facts.py && sleep 0' 30564 1726882895.06909: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882895.07182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882895.07200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882895.07220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882895.07262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882895.07279: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882895.07295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882895.07314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882895.07328: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882895.07339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882895.07352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882895.07367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882895.07384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882895.07397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882895.08079: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882895.08096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882895.08175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882895.08198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882895.08216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882895.08358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882895.54444: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", 
"version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": 
"3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 30564 1726882895.54481: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": 
"readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 30564 1726882895.54494: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", 
"release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 30564 1726882895.54499: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", 
"release<<< 30564 1726882895.54506: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": 
"python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 30564 1726882895.54509: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 30564 1726882895.54560: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": 
"26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 30564 1726882895.54568: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": 
"oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": 
"libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 
17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 30564 1726882895.54581: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": 
"8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 30564 1726882895.54590: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 30564 1726882895.54596: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 30564 1726882895.54619: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 30564 1726882895.54645: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": 
"2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 30564 1726882895.54651: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882895.56170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882895.56219: stderr chunk (state=3): >>><<< 30564 1726882895.56221: stdout chunk (state=3): >>><<< 30564 1726882895.56275: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882895.58798: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882895.58826: _low_level_execute_command(): starting 30564 1726882895.58834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882894.9237692-34677-96528507853657/ > /dev/null 2>&1 && sleep 0' 30564 1726882895.59488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882895.59502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882895.59517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882895.59536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882895.59583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882895.59596: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882895.59609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882895.59626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882895.59638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882895.59648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882895.59659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882895.59678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882895.59696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882895.59708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882895.59718: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882895.59731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882895.59811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882895.59828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882895.59842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882895.59978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882895.61792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882895.61873: stderr chunk (state=3): >>><<< 30564 1726882895.61887: stdout chunk (state=3): >>><<< 30564 1726882895.61969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882895.61973: handler run complete 30564 1726882895.62888: variable 'ansible_facts' from source: unknown 30564 1726882895.63439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882895.65847: variable 'ansible_facts' from source: unknown 30564 1726882895.66362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882895.67244: attempt loop complete, returning result 30564 1726882895.67267: _execute() done 30564 1726882895.67283: dumping result to json 30564 1726882895.67538: done dumping result, returning 30564 1726882895.67552: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000001d88] 30564 1726882895.67563: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d88 30564 1726882895.69871: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d88 30564 1726882895.69875: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882895.70129: no more pending results, returning what we have 30564 1726882895.70132: results queue empty 30564 1726882895.70134: checking for 
any_errors_fatal 30564 1726882895.70143: done checking for any_errors_fatal 30564 1726882895.70144: checking for max_fail_percentage 30564 1726882895.70146: done checking for max_fail_percentage 30564 1726882895.70147: checking to see if all hosts have failed and the running result is not ok 30564 1726882895.70148: done checking to see if all hosts have failed 30564 1726882895.70151: getting the remaining hosts for this loop 30564 1726882895.70153: done getting the remaining hosts for this loop 30564 1726882895.70158: getting the next task for host managed_node2 30564 1726882895.70169: done getting next task for host managed_node2 30564 1726882895.70174: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882895.70182: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882895.70196: getting variables 30564 1726882895.70198: in VariableManager get_vars() 30564 1726882895.70240: Calling all_inventory to load vars for managed_node2 30564 1726882895.70243: Calling groups_inventory to load vars for managed_node2 30564 1726882895.70251: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882895.70280: Calling all_plugins_play to load vars for managed_node2 30564 1726882895.70285: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882895.70289: Calling groups_plugins_play to load vars for managed_node2 30564 1726882895.71974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882895.74826: done with get_vars() 30564 1726882895.74859: done getting variables 30564 1726882895.74987: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:35 -0400 (0:00:00.884) 0:01:34.332 ****** 30564 1726882895.75151: entering _queue_task() for managed_node2/debug 30564 1726882895.75724: worker is 1 (out of 1 available) 30564 1726882895.75736: exiting _queue_task() for managed_node2/debug 30564 1726882895.75748: done queuing things up, now waiting for results queue to drain 30564 1726882895.75749: waiting for pending results... 
30564 1726882895.75846: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882895.76021: in run() - task 0e448fcc-3ce9-4216-acec-000000001d2c 30564 1726882895.76044: variable 'ansible_search_path' from source: unknown 30564 1726882895.76053: variable 'ansible_search_path' from source: unknown 30564 1726882895.76101: calling self._execute() 30564 1726882895.76232: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882895.76245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882895.76261: variable 'omit' from source: magic vars 30564 1726882895.76697: variable 'ansible_distribution_major_version' from source: facts 30564 1726882895.76718: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882895.76730: variable 'omit' from source: magic vars 30564 1726882895.76808: variable 'omit' from source: magic vars 30564 1726882895.76916: variable 'network_provider' from source: set_fact 30564 1726882895.76939: variable 'omit' from source: magic vars 30564 1726882895.76996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882895.77039: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882895.77071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882895.77107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882895.77125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882895.77160: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882895.77174: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882895.77183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882895.77297: Set connection var ansible_timeout to 10 30564 1726882895.77318: Set connection var ansible_pipelining to False 30564 1726882895.77326: Set connection var ansible_shell_type to sh 30564 1726882895.77338: Set connection var ansible_shell_executable to /bin/sh 30564 1726882895.78822: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882895.78828: Set connection var ansible_connection to ssh 30564 1726882895.78857: variable 'ansible_shell_executable' from source: unknown 30564 1726882895.78930: variable 'ansible_connection' from source: unknown 30564 1726882895.78939: variable 'ansible_module_compression' from source: unknown 30564 1726882895.78946: variable 'ansible_shell_type' from source: unknown 30564 1726882895.78953: variable 'ansible_shell_executable' from source: unknown 30564 1726882895.78959: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882895.78972: variable 'ansible_pipelining' from source: unknown 30564 1726882895.78980: variable 'ansible_timeout' from source: unknown 30564 1726882895.78987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882895.79371: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882895.79390: variable 'omit' from source: magic vars 30564 1726882895.79400: starting attempt loop 30564 1726882895.79408: running the handler 30564 1726882895.79462: handler run complete 30564 1726882895.79597: attempt loop complete, returning result 30564 1726882895.79604: _execute() done 30564 1726882895.79612: dumping result to json 30564 1726882895.79619: done dumping result, returning 
30564 1726882895.79631: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000001d2c] 30564 1726882895.79642: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2c ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882895.79809: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2c 30564 1726882895.79814: WORKER PROCESS EXITING 30564 1726882895.79837: no more pending results, returning what we have 30564 1726882895.79840: results queue empty 30564 1726882895.79841: checking for any_errors_fatal 30564 1726882895.79851: done checking for any_errors_fatal 30564 1726882895.79852: checking for max_fail_percentage 30564 1726882895.79854: done checking for max_fail_percentage 30564 1726882895.79855: checking to see if all hosts have failed and the running result is not ok 30564 1726882895.79856: done checking to see if all hosts have failed 30564 1726882895.79857: getting the remaining hosts for this loop 30564 1726882895.79859: done getting the remaining hosts for this loop 30564 1726882895.79863: getting the next task for host managed_node2 30564 1726882895.79872: done getting next task for host managed_node2 30564 1726882895.79876: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882895.79883: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882895.79895: getting variables 30564 1726882895.79896: in VariableManager get_vars() 30564 1726882895.79940: Calling all_inventory to load vars for managed_node2 30564 1726882895.79942: Calling groups_inventory to load vars for managed_node2 30564 1726882895.79944: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882895.79954: Calling all_plugins_play to load vars for managed_node2 30564 1726882895.79956: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882895.79959: Calling groups_plugins_play to load vars for managed_node2 30564 1726882895.82640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882895.84906: done with get_vars() 30564 1726882895.84935: done getting variables 30564 1726882895.84999: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:35 -0400 (0:00:00.098) 0:01:34.431 ****** 30564 1726882895.85049: entering _queue_task() for managed_node2/fail 30564 1726882895.85410: worker is 1 (out of 1 available) 30564 1726882895.85425: exiting _queue_task() for managed_node2/fail 30564 1726882895.85438: done queuing things up, now waiting for results queue to drain 30564 1726882895.85439: waiting for pending results... 30564 1726882895.85771: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882895.85925: in run() - task 0e448fcc-3ce9-4216-acec-000000001d2d 30564 1726882895.85935: variable 'ansible_search_path' from source: unknown 30564 1726882895.85939: variable 'ansible_search_path' from source: unknown 30564 1726882895.85974: calling self._execute() 30564 1726882895.86081: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882895.86086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882895.86098: variable 'omit' from source: magic vars 30564 1726882895.86726: variable 'ansible_distribution_major_version' from source: facts 30564 1726882895.86740: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882895.86871: variable 'network_state' from source: role '' defaults 30564 1726882895.86880: Evaluated conditional (network_state != {}): False 30564 1726882895.86883: when evaluation is False, skipping this task 30564 1726882895.86886: _execute() done 30564 1726882895.86890: dumping result to json 30564 1726882895.86893: done dumping result, returning 30564 1726882895.86899: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-000000001d2d] 30564 1726882895.86904: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2d 30564 1726882895.87030: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2d 30564 1726882895.87034: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882895.87090: no more pending results, returning what we have 30564 1726882895.87095: results queue empty 30564 1726882895.87096: checking for any_errors_fatal 30564 1726882895.87106: done checking for any_errors_fatal 30564 1726882895.87107: checking for max_fail_percentage 30564 1726882895.87109: done checking for max_fail_percentage 30564 1726882895.87110: checking to see if all hosts have failed and the running result is not ok 30564 1726882895.87111: done checking to see if all hosts have failed 30564 1726882895.87112: getting the remaining hosts for this loop 30564 1726882895.87114: done getting the remaining hosts for this loop 30564 1726882895.87118: getting the next task for host managed_node2 30564 1726882895.87129: done getting next task for host managed_node2 30564 1726882895.87134: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882895.87141: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882895.87173: getting variables 30564 1726882895.87176: in VariableManager get_vars() 30564 1726882895.87225: Calling all_inventory to load vars for managed_node2 30564 1726882895.87229: Calling groups_inventory to load vars for managed_node2 30564 1726882895.87232: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882895.87245: Calling all_plugins_play to load vars for managed_node2 30564 1726882895.87249: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882895.87253: Calling groups_plugins_play to load vars for managed_node2 30564 1726882895.89108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882895.90909: done with get_vars() 30564 1726882895.90944: done getting variables 30564 1726882895.91007: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:35 -0400 (0:00:00.060) 0:01:34.492 ****** 30564 1726882895.91084: entering _queue_task() for managed_node2/fail 30564 1726882895.92298: worker is 1 (out of 1 available) 30564 1726882895.92311: exiting _queue_task() for managed_node2/fail 30564 1726882895.92439: done queuing things up, now waiting for results queue to drain 30564 1726882895.92441: waiting for pending results... 30564 1726882895.94201: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882895.94626: in run() - task 0e448fcc-3ce9-4216-acec-000000001d2e 30564 1726882895.94738: variable 'ansible_search_path' from source: unknown 30564 1726882895.94746: variable 'ansible_search_path' from source: unknown 30564 1726882895.94896: calling self._execute() 30564 1726882895.95010: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882895.95287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882895.95304: variable 'omit' from source: magic vars 30564 1726882895.95888: variable 'ansible_distribution_major_version' from source: facts 30564 1726882895.95907: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882895.96195: variable 'network_state' from source: role '' defaults 30564 1726882895.96211: Evaluated conditional (network_state != {}): False 30564 1726882895.96219: when evaluation is False, skipping this task 30564 1726882895.96226: _execute() done 30564 1726882895.96233: dumping result to json 30564 1726882895.96279: done dumping result, returning 30564 1726882895.96293: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-000000001d2e] 30564 1726882895.96305: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2e skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882895.96465: no more pending results, returning what we have 30564 1726882895.96470: results queue empty 30564 1726882895.96471: checking for any_errors_fatal 30564 1726882895.96483: done checking for any_errors_fatal 30564 1726882895.96484: checking for max_fail_percentage 30564 1726882895.96486: done checking for max_fail_percentage 30564 1726882895.96487: checking to see if all hosts have failed and the running result is not ok 30564 1726882895.96488: done checking to see if all hosts have failed 30564 1726882895.96489: getting the remaining hosts for this loop 30564 1726882895.96491: done getting the remaining hosts for this loop 30564 1726882895.96496: getting the next task for host managed_node2 30564 1726882895.96504: done getting next task for host managed_node2 30564 1726882895.96510: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882895.96516: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882895.96534: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2e 30564 1726882895.96538: WORKER PROCESS EXITING 30564 1726882895.96557: getting variables 30564 1726882895.96559: in VariableManager get_vars() 30564 1726882895.96608: Calling all_inventory to load vars for managed_node2 30564 1726882895.96611: Calling groups_inventory to load vars for managed_node2 30564 1726882895.96613: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882895.96625: Calling all_plugins_play to load vars for managed_node2 30564 1726882895.96628: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882895.96632: Calling groups_plugins_play to load vars for managed_node2 30564 1726882895.99615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.03323: done with get_vars() 30564 1726882896.03360: done getting variables 30564 1726882896.03538: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:36 -0400 (0:00:00.124) 0:01:34.617 ****** 30564 1726882896.03576: entering _queue_task() for managed_node2/fail 30564 1726882896.04381: worker is 1 (out of 1 available) 30564 1726882896.04394: exiting _queue_task() for managed_node2/fail 30564 1726882896.04408: done queuing things up, now waiting for results queue to drain 30564 1726882896.04409: waiting for pending results... 30564 1726882896.04961: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882896.05525: in run() - task 0e448fcc-3ce9-4216-acec-000000001d2f 30564 1726882896.05545: variable 'ansible_search_path' from source: unknown 30564 1726882896.05578: variable 'ansible_search_path' from source: unknown 30564 1726882896.05620: calling self._execute() 30564 1726882896.05884: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.05982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.05999: variable 'omit' from source: magic vars 30564 1726882896.06579: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.06596: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.06770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882896.10666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882896.11251: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882896.11303: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882896.11343: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882896.11380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882896.11546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.11646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.11902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.11949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.11976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.12254: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.12488: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882896.12496: when evaluation is False, skipping this task 30564 1726882896.12502: _execute() done 30564 1726882896.12508: dumping result to json 30564 1726882896.12514: done dumping result, returning 30564 1726882896.12525: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0e448fcc-3ce9-4216-acec-000000001d2f] 30564 1726882896.12534: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2f 30564 1726882896.12652: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d2f skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882896.12704: no more pending results, returning what we have 30564 1726882896.12708: results queue empty 30564 1726882896.12709: checking for any_errors_fatal 30564 1726882896.12716: done checking for any_errors_fatal 30564 1726882896.12717: checking for max_fail_percentage 30564 1726882896.12719: done checking for max_fail_percentage 30564 1726882896.12720: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.12721: done checking to see if all hosts have failed 30564 1726882896.12721: getting the remaining hosts for this loop 30564 1726882896.12724: done getting the remaining hosts for this loop 30564 1726882896.12727: getting the next task for host managed_node2 30564 1726882896.12736: done getting next task for host managed_node2 30564 1726882896.12740: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882896.12746: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.12779: getting variables 30564 1726882896.12781: in VariableManager get_vars() 30564 1726882896.12824: Calling all_inventory to load vars for managed_node2 30564 1726882896.12826: Calling groups_inventory to load vars for managed_node2 30564 1726882896.12828: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.12841: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.12844: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.12848: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.13411: WORKER PROCESS EXITING 30564 1726882896.15998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.19496: done with get_vars() 30564 1726882896.19522: done getting variables 30564 1726882896.19597: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:36 -0400 (0:00:00.160) 0:01:34.777 ****** 30564 1726882896.19633: entering _queue_task() for managed_node2/dnf 30564 1726882896.20036: worker is 1 (out of 1 available) 30564 1726882896.20051: exiting _queue_task() for managed_node2/dnf 30564 1726882896.20070: done queuing things up, now waiting for results queue to drain 30564 1726882896.20071: waiting for pending results... 30564 1726882896.20295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882896.20397: in run() - task 0e448fcc-3ce9-4216-acec-000000001d30 30564 1726882896.20411: variable 'ansible_search_path' from source: unknown 30564 1726882896.20415: variable 'ansible_search_path' from source: unknown 30564 1726882896.20444: calling self._execute() 30564 1726882896.20530: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.20535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.20541: variable 'omit' from source: magic vars 30564 1726882896.20839: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.20852: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.20998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882896.30576: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882896.30659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882896.30714: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882896.30754: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882896.30799: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882896.30888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.30931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.30974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.31021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.31054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.31182: variable 'ansible_distribution' from source: facts 30564 1726882896.31192: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.31210: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882896.31341: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882896.31500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882896.31526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.31553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.31612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.31633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.31686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.31724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.31754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.31810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.31835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.31916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.31958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.32005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.32087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.32107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.32312: variable 'network_connections' from source: include params 30564 1726882896.32325: variable 'interface' from source: play vars 30564 1726882896.32435: variable 'interface' from source: play vars 30564 1726882896.32533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882896.32755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882896.32804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882896.32845: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882896.32884: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882896.32940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882896.32970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882896.33017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.33067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882896.33115: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882896.33405: variable 'network_connections' from source: include params 30564 1726882896.33415: variable 'interface' from source: play vars 30564 1726882896.33493: variable 'interface' from source: play vars 30564 1726882896.33518: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882896.33527: when evaluation is False, skipping this task 30564 1726882896.33537: _execute() done 30564 1726882896.33546: dumping result to json 30564 1726882896.33552: done dumping result, returning 30564 1726882896.33562: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001d30] 30564 1726882896.33572: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d30 skipping: [managed_node2] 
=> { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882896.33738: no more pending results, returning what we have 30564 1726882896.33741: results queue empty 30564 1726882896.33742: checking for any_errors_fatal 30564 1726882896.33749: done checking for any_errors_fatal 30564 1726882896.33750: checking for max_fail_percentage 30564 1726882896.33752: done checking for max_fail_percentage 30564 1726882896.33753: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.33753: done checking to see if all hosts have failed 30564 1726882896.33754: getting the remaining hosts for this loop 30564 1726882896.33756: done getting the remaining hosts for this loop 30564 1726882896.33759: getting the next task for host managed_node2 30564 1726882896.33769: done getting next task for host managed_node2 30564 1726882896.33773: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882896.33779: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.33801: getting variables 30564 1726882896.33802: in VariableManager get_vars() 30564 1726882896.33843: Calling all_inventory to load vars for managed_node2 30564 1726882896.33846: Calling groups_inventory to load vars for managed_node2 30564 1726882896.33849: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.33858: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.33861: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.33865: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.40661: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d30 30564 1726882896.40667: WORKER PROCESS EXITING 30564 1726882896.41461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.42407: done with get_vars() 30564 1726882896.42425: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882896.42474: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:36 -0400 (0:00:00.228) 0:01:35.006 ****** 30564 1726882896.42495: entering _queue_task() for managed_node2/yum 30564 1726882896.42741: worker is 1 (out of 1 available) 30564 1726882896.42758: exiting _queue_task() for managed_node2/yum 30564 1726882896.42771: done queuing things up, now waiting for results queue to drain 30564 1726882896.42773: waiting for pending results... 30564 1726882896.42972: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882896.43110: in run() - task 0e448fcc-3ce9-4216-acec-000000001d31 30564 1726882896.43152: variable 'ansible_search_path' from source: unknown 30564 1726882896.43157: variable 'ansible_search_path' from source: unknown 30564 1726882896.43185: calling self._execute() 30564 1726882896.43298: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.43303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.43313: variable 'omit' from source: magic vars 30564 1726882896.43730: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.43753: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.43960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882896.46284: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882896.46340: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882896.46370: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882896.46397: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882896.46416: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882896.46478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.46498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.46516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.46545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.46558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.46630: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.46649: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882896.46652: when evaluation is False, skipping this task 30564 1726882896.46655: _execute() done 30564 1726882896.46660: dumping result to json 30564 1726882896.46662: done dumping result, returning 30564 1726882896.46665: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or 
team interfaces [0e448fcc-3ce9-4216-acec-000000001d31] 30564 1726882896.46676: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d31 30564 1726882896.46766: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d31 30564 1726882896.46771: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882896.46819: no more pending results, returning what we have 30564 1726882896.46823: results queue empty 30564 1726882896.46824: checking for any_errors_fatal 30564 1726882896.46835: done checking for any_errors_fatal 30564 1726882896.46835: checking for max_fail_percentage 30564 1726882896.46837: done checking for max_fail_percentage 30564 1726882896.46838: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.46838: done checking to see if all hosts have failed 30564 1726882896.46839: getting the remaining hosts for this loop 30564 1726882896.46842: done getting the remaining hosts for this loop 30564 1726882896.46846: getting the next task for host managed_node2 30564 1726882896.46854: done getting next task for host managed_node2 30564 1726882896.46857: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882896.46865: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.46891: getting variables 30564 1726882896.46893: in VariableManager get_vars() 30564 1726882896.46931: Calling all_inventory to load vars for managed_node2 30564 1726882896.46934: Calling groups_inventory to load vars for managed_node2 30564 1726882896.46936: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.46945: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.46947: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.46950: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.48127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.49453: done with get_vars() 30564 1726882896.49474: done getting variables 30564 1726882896.49515: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 
21:41:36 -0400 (0:00:00.070) 0:01:35.076 ****** 30564 1726882896.49540: entering _queue_task() for managed_node2/fail 30564 1726882896.49775: worker is 1 (out of 1 available) 30564 1726882896.49788: exiting _queue_task() for managed_node2/fail 30564 1726882896.49802: done queuing things up, now waiting for results queue to drain 30564 1726882896.49803: waiting for pending results... 30564 1726882896.50001: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882896.50100: in run() - task 0e448fcc-3ce9-4216-acec-000000001d32 30564 1726882896.50115: variable 'ansible_search_path' from source: unknown 30564 1726882896.50124: variable 'ansible_search_path' from source: unknown 30564 1726882896.50154: calling self._execute() 30564 1726882896.50242: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.50246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.50255: variable 'omit' from source: magic vars 30564 1726882896.50542: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.50554: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.50640: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882896.50779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882896.52356: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882896.52413: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882896.52439: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882896.52465: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882896.52488: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882896.52546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.52567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.52586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.52619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.52626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.52657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.52677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.52695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 30564 1726882896.52727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.52734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.52762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.52780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.52798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.52823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.52836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.52951: variable 'network_connections' from source: include params 30564 1726882896.52961: variable 'interface' from source: play vars 30564 1726882896.53008: variable 'interface' from source: play vars 30564 1726882896.53060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882896.53175: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882896.53208: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882896.53231: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882896.53251: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882896.53288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882896.53303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882896.53320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.53337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882896.53379: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882896.53530: variable 'network_connections' from source: include params 30564 1726882896.53533: variable 'interface' from source: play vars 30564 1726882896.53578: variable 'interface' from source: play vars 30564 1726882896.53600: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882896.53604: when evaluation is False, skipping this task 30564 1726882896.53611: _execute() done 30564 1726882896.53613: dumping result to json 30564 1726882896.53615: done dumping result, returning 30564 1726882896.53618: 
done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001d32] 30564 1726882896.53620: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d32 30564 1726882896.53720: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d32 30564 1726882896.53723: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882896.53787: no more pending results, returning what we have 30564 1726882896.53790: results queue empty 30564 1726882896.53792: checking for any_errors_fatal 30564 1726882896.53797: done checking for any_errors_fatal 30564 1726882896.53798: checking for max_fail_percentage 30564 1726882896.53804: done checking for max_fail_percentage 30564 1726882896.53805: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.53806: done checking to see if all hosts have failed 30564 1726882896.53806: getting the remaining hosts for this loop 30564 1726882896.53808: done getting the remaining hosts for this loop 30564 1726882896.53815: getting the next task for host managed_node2 30564 1726882896.53825: done getting next task for host managed_node2 30564 1726882896.53830: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882896.53835: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.53854: getting variables 30564 1726882896.53855: in VariableManager get_vars() 30564 1726882896.53895: Calling all_inventory to load vars for managed_node2 30564 1726882896.53898: Calling groups_inventory to load vars for managed_node2 30564 1726882896.53899: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.53912: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.53915: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.53918: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.54734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.55718: done with get_vars() 30564 1726882896.55733: done getting variables 30564 1726882896.55780: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:36 -0400 (0:00:00.062) 0:01:35.139 ****** 30564 1726882896.55807: entering _queue_task() for managed_node2/package 30564 1726882896.56023: worker is 1 (out of 1 available) 30564 1726882896.56037: exiting _queue_task() for managed_node2/package 30564 1726882896.56050: done queuing things up, now waiting for results queue to drain 30564 1726882896.56051: waiting for pending results... 30564 1726882896.56243: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882896.56346: in run() - task 0e448fcc-3ce9-4216-acec-000000001d33 30564 1726882896.56358: variable 'ansible_search_path' from source: unknown 30564 1726882896.56361: variable 'ansible_search_path' from source: unknown 30564 1726882896.56392: calling self._execute() 30564 1726882896.56475: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.56479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.56490: variable 'omit' from source: magic vars 30564 1726882896.56766: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.56777: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.56912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882896.57109: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882896.57139: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882896.57165: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882896.57476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
30564 1726882896.57553: variable 'network_packages' from source: role '' defaults 30564 1726882896.57631: variable '__network_provider_setup' from source: role '' defaults 30564 1726882896.57639: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882896.57687: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882896.57695: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882896.57740: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882896.57858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882896.59274: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882896.59311: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882896.59341: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882896.59367: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882896.59387: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882896.59446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.59479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.59497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 30564 1726882896.59523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.59534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.59573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.59587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.59603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.59628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.59638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.59788: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882896.59854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.59877: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.59895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.59921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.59931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.59995: variable 'ansible_python' from source: facts 30564 1726882896.60012: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882896.60062: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882896.60120: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882896.60202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.60221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.60240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.60265: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.60278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.60311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.60336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.60351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.60378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.60389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.60488: variable 'network_connections' from source: include params 30564 1726882896.60491: variable 'interface' from source: play vars 30564 1726882896.60566: variable 'interface' from source: play vars 30564 1726882896.60614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882896.60636: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882896.60661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.60685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882896.60723: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882896.60908: variable 'network_connections' from source: include params 30564 1726882896.60912: variable 'interface' from source: play vars 30564 1726882896.60987: variable 'interface' from source: play vars 30564 1726882896.61007: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882896.61059: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882896.61257: variable 'network_connections' from source: include params 30564 1726882896.61260: variable 'interface' from source: play vars 30564 1726882896.61313: variable 'interface' from source: play vars 30564 1726882896.61327: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882896.61381: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882896.61580: variable 'network_connections' from source: include params 30564 1726882896.61583: variable 'interface' from source: play vars 30564 1726882896.61631: variable 'interface' from source: play vars 30564 1726882896.61672: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882896.61717: variable '__network_service_name_default_initscripts' from source: role '' defaults 
30564 1726882896.61720: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882896.61765: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882896.61904: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882896.62235: variable 'network_connections' from source: include params 30564 1726882896.62238: variable 'interface' from source: play vars 30564 1726882896.62289: variable 'interface' from source: play vars 30564 1726882896.62293: variable 'ansible_distribution' from source: facts 30564 1726882896.62295: variable '__network_rh_distros' from source: role '' defaults 30564 1726882896.62301: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.62312: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882896.62424: variable 'ansible_distribution' from source: facts 30564 1726882896.62427: variable '__network_rh_distros' from source: role '' defaults 30564 1726882896.62431: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.62442: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882896.62550: variable 'ansible_distribution' from source: facts 30564 1726882896.62554: variable '__network_rh_distros' from source: role '' defaults 30564 1726882896.62558: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.62593: variable 'network_provider' from source: set_fact 30564 1726882896.62603: variable 'ansible_facts' from source: unknown 30564 1726882896.63144: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882896.63153: when evaluation is False, skipping this task 30564 1726882896.63172: _execute() done 30564 1726882896.63180: dumping result to json 30564 1726882896.63187: done dumping result, returning 
30564 1726882896.63200: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000001d33] 30564 1726882896.63210: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d33 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882896.63402: no more pending results, returning what we have 30564 1726882896.63406: results queue empty 30564 1726882896.63407: checking for any_errors_fatal 30564 1726882896.63415: done checking for any_errors_fatal 30564 1726882896.63415: checking for max_fail_percentage 30564 1726882896.63417: done checking for max_fail_percentage 30564 1726882896.63418: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.63419: done checking to see if all hosts have failed 30564 1726882896.63420: getting the remaining hosts for this loop 30564 1726882896.63422: done getting the remaining hosts for this loop 30564 1726882896.63426: getting the next task for host managed_node2 30564 1726882896.63435: done getting next task for host managed_node2 30564 1726882896.63439: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882896.63448: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.63479: getting variables 30564 1726882896.63481: in VariableManager get_vars() 30564 1726882896.63532: Calling all_inventory to load vars for managed_node2 30564 1726882896.63535: Calling groups_inventory to load vars for managed_node2 30564 1726882896.63542: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.63599: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.63604: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.63610: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d33 30564 1726882896.63612: WORKER PROCESS EXITING 30564 1726882896.63616: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.64970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.65927: done with get_vars() 30564 1726882896.65945: done getting variables 30564 1726882896.66021: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task 
path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:36 -0400 (0:00:00.102) 0:01:35.241 ****** 30564 1726882896.66055: entering _queue_task() for managed_node2/package 30564 1726882896.66740: worker is 1 (out of 1 available) 30564 1726882896.66751: exiting _queue_task() for managed_node2/package 30564 1726882896.66766: done queuing things up, now waiting for results queue to drain 30564 1726882896.66768: waiting for pending results... 30564 1726882896.67058: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882896.67203: in run() - task 0e448fcc-3ce9-4216-acec-000000001d34 30564 1726882896.67220: variable 'ansible_search_path' from source: unknown 30564 1726882896.67226: variable 'ansible_search_path' from source: unknown 30564 1726882896.67261: calling self._execute() 30564 1726882896.67370: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.67374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.67384: variable 'omit' from source: magic vars 30564 1726882896.67772: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.67784: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.67911: variable 'network_state' from source: role '' defaults 30564 1726882896.67922: Evaluated conditional (network_state != {}): False 30564 1726882896.67925: when evaluation is False, skipping this task 30564 1726882896.67927: _execute() done 30564 1726882896.67930: dumping result to json 30564 1726882896.67932: done dumping result, returning 30564 1726882896.67942: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 
[0e448fcc-3ce9-4216-acec-000000001d34] 30564 1726882896.67948: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d34 30564 1726882896.68054: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d34 30564 1726882896.68057: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882896.68120: no more pending results, returning what we have 30564 1726882896.68124: results queue empty 30564 1726882896.68126: checking for any_errors_fatal 30564 1726882896.68134: done checking for any_errors_fatal 30564 1726882896.68134: checking for max_fail_percentage 30564 1726882896.68136: done checking for max_fail_percentage 30564 1726882896.68137: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.68138: done checking to see if all hosts have failed 30564 1726882896.68139: getting the remaining hosts for this loop 30564 1726882896.68141: done getting the remaining hosts for this loop 30564 1726882896.68145: getting the next task for host managed_node2 30564 1726882896.68155: done getting next task for host managed_node2 30564 1726882896.68159: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882896.68168: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.68198: getting variables 30564 1726882896.68200: in VariableManager get_vars() 30564 1726882896.68245: Calling all_inventory to load vars for managed_node2 30564 1726882896.68248: Calling groups_inventory to load vars for managed_node2 30564 1726882896.68250: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.68263: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.68268: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.68271: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.69975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.71718: done with get_vars() 30564 1726882896.71741: done getting variables 30564 1726882896.71797: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:36 -0400 
(0:00:00.057) 0:01:35.299 ****** 30564 1726882896.71830: entering _queue_task() for managed_node2/package 30564 1726882896.72115: worker is 1 (out of 1 available) 30564 1726882896.72127: exiting _queue_task() for managed_node2/package 30564 1726882896.72139: done queuing things up, now waiting for results queue to drain 30564 1726882896.72140: waiting for pending results... 30564 1726882896.72434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882896.72572: in run() - task 0e448fcc-3ce9-4216-acec-000000001d35 30564 1726882896.72587: variable 'ansible_search_path' from source: unknown 30564 1726882896.72591: variable 'ansible_search_path' from source: unknown 30564 1726882896.72624: calling self._execute() 30564 1726882896.72728: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.72732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.72743: variable 'omit' from source: magic vars 30564 1726882896.73117: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.73136: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.73263: variable 'network_state' from source: role '' defaults 30564 1726882896.73275: Evaluated conditional (network_state != {}): False 30564 1726882896.73278: when evaluation is False, skipping this task 30564 1726882896.73281: _execute() done 30564 1726882896.73284: dumping result to json 30564 1726882896.73286: done dumping result, returning 30564 1726882896.73295: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000001d35] 30564 1726882896.73301: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d35 30564 1726882896.73404: done sending task result for task 
0e448fcc-3ce9-4216-acec-000000001d35 30564 1726882896.73408: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882896.73453: no more pending results, returning what we have 30564 1726882896.73458: results queue empty 30564 1726882896.73459: checking for any_errors_fatal 30564 1726882896.73470: done checking for any_errors_fatal 30564 1726882896.73471: checking for max_fail_percentage 30564 1726882896.73473: done checking for max_fail_percentage 30564 1726882896.73474: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.73475: done checking to see if all hosts have failed 30564 1726882896.73476: getting the remaining hosts for this loop 30564 1726882896.73478: done getting the remaining hosts for this loop 30564 1726882896.73482: getting the next task for host managed_node2 30564 1726882896.73490: done getting next task for host managed_node2 30564 1726882896.73495: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882896.73502: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.73530: getting variables 30564 1726882896.73532: in VariableManager get_vars() 30564 1726882896.73577: Calling all_inventory to load vars for managed_node2 30564 1726882896.73580: Calling groups_inventory to load vars for managed_node2 30564 1726882896.73583: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.73595: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.73598: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.73601: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.75170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.76875: done with get_vars() 30564 1726882896.76898: done getting variables 30564 1726882896.76954: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:36 -0400 (0:00:00.051) 0:01:35.351 ****** 30564 1726882896.76993: entering _queue_task() for managed_node2/service 30564 1726882896.77271: worker is 1 (out of 1 available) 30564 1726882896.77284: exiting _queue_task() for managed_node2/service 30564 1726882896.77296: done 
queuing things up, now waiting for results queue to drain 30564 1726882896.77297: waiting for pending results... 30564 1726882896.77595: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882896.77732: in run() - task 0e448fcc-3ce9-4216-acec-000000001d36 30564 1726882896.77750: variable 'ansible_search_path' from source: unknown 30564 1726882896.77753: variable 'ansible_search_path' from source: unknown 30564 1726882896.77790: calling self._execute() 30564 1726882896.77889: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.77893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.77903: variable 'omit' from source: magic vars 30564 1726882896.78271: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.78286: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.78411: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882896.78609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882896.81306: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882896.81385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882896.81424: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882896.81456: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882896.81484: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882896.81562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.81592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.81617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.81662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.81678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.81722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.81747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.81778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.81818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.81832: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.81878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.81901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.81925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.81972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.81983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.82156: variable 'network_connections' from source: include params 30564 1726882896.82171: variable 'interface' from source: play vars 30564 1726882896.82234: variable 'interface' from source: play vars 30564 1726882896.82310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882896.82471: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882896.82503: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882896.82549: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882896.82578: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882896.82621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882896.82646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882896.82673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.82702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882896.82752: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882896.82996: variable 'network_connections' from source: include params 30564 1726882896.83001: variable 'interface' from source: play vars 30564 1726882896.83065: variable 'interface' from source: play vars 30564 1726882896.83088: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882896.83092: when evaluation is False, skipping this task 30564 1726882896.83095: _execute() done 30564 1726882896.83097: dumping result to json 30564 1726882896.83099: done dumping result, returning 30564 1726882896.83107: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000001d36] 30564 1726882896.83113: sending task result for task 
0e448fcc-3ce9-4216-acec-000000001d36 30564 1726882896.83215: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d36 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882896.83273: no more pending results, returning what we have 30564 1726882896.83277: results queue empty 30564 1726882896.83278: checking for any_errors_fatal 30564 1726882896.83285: done checking for any_errors_fatal 30564 1726882896.83286: checking for max_fail_percentage 30564 1726882896.83288: done checking for max_fail_percentage 30564 1726882896.83289: checking to see if all hosts have failed and the running result is not ok 30564 1726882896.83289: done checking to see if all hosts have failed 30564 1726882896.83290: getting the remaining hosts for this loop 30564 1726882896.83292: done getting the remaining hosts for this loop 30564 1726882896.83296: getting the next task for host managed_node2 30564 1726882896.83305: done getting next task for host managed_node2 30564 1726882896.83310: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882896.83316: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882896.83333: WORKER PROCESS EXITING 30564 1726882896.83350: getting variables 30564 1726882896.83352: in VariableManager get_vars() 30564 1726882896.83397: Calling all_inventory to load vars for managed_node2 30564 1726882896.83400: Calling groups_inventory to load vars for managed_node2 30564 1726882896.83402: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882896.83413: Calling all_plugins_play to load vars for managed_node2 30564 1726882896.83416: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882896.83419: Calling groups_plugins_play to load vars for managed_node2 30564 1726882896.85236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882896.86912: done with get_vars() 30564 1726882896.86935: done getting variables 30564 1726882896.86996: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:41:36 -0400 (0:00:00.100) 0:01:35.451 ****** 30564 1726882896.87030: entering _queue_task() for managed_node2/service 30564 1726882896.87348: worker is 1 (out of 1 available) 
30564 1726882896.87362: exiting _queue_task() for managed_node2/service 30564 1726882896.87377: done queuing things up, now waiting for results queue to drain 30564 1726882896.87379: waiting for pending results... 30564 1726882896.87690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882896.87876: in run() - task 0e448fcc-3ce9-4216-acec-000000001d37 30564 1726882896.87881: variable 'ansible_search_path' from source: unknown 30564 1726882896.87886: variable 'ansible_search_path' from source: unknown 30564 1726882896.87889: calling self._execute() 30564 1726882896.88077: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.88081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.88084: variable 'omit' from source: magic vars 30564 1726882896.88408: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.88422: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882896.88595: variable 'network_provider' from source: set_fact 30564 1726882896.88598: variable 'network_state' from source: role '' defaults 30564 1726882896.88611: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882896.88614: variable 'omit' from source: magic vars 30564 1726882896.88675: variable 'omit' from source: magic vars 30564 1726882896.88705: variable 'network_service_name' from source: role '' defaults 30564 1726882896.88770: variable 'network_service_name' from source: role '' defaults 30564 1726882896.88858: variable '__network_provider_setup' from source: role '' defaults 30564 1726882896.88864: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882896.88931: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882896.88938: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882896.88996: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882896.89213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882896.91532: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882896.91603: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882896.91641: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882896.91676: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882896.91700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882896.91780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.91806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.91830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.91875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.91888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30564 1726882896.91929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.91953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.91979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.92016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.92029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.92243: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882896.92354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.92381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.92405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.92441: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.92453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.92549: variable 'ansible_python' from source: facts 30564 1726882896.92567: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882896.92650: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882896.92734: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882896.92859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.92884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.92908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.92951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.92967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.93011: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882896.93038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882896.93066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.93109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882896.93121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882896.93271: variable 'network_connections' from source: include params 30564 1726882896.93275: variable 'interface' from source: play vars 30564 1726882896.93347: variable 'interface' from source: play vars 30564 1726882896.93461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882896.93874: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882896.93929: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882896.93977: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882896.94024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882896.94091: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882896.94121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882896.94161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882896.94198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882896.94254: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882896.94541: variable 'network_connections' from source: include params 30564 1726882896.94547: variable 'interface' from source: play vars 30564 1726882896.94625: variable 'interface' from source: play vars 30564 1726882896.94654: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882896.94736: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882896.95031: variable 'network_connections' from source: include params 30564 1726882896.95034: variable 'interface' from source: play vars 30564 1726882896.95109: variable 'interface' from source: play vars 30564 1726882896.95130: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882896.95207: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882896.95501: variable 'network_connections' from source: include params 30564 1726882896.95505: variable 'interface' from source: play vars 30564 1726882896.95578: variable 'interface' from source: play vars 30564 
1726882896.95628: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882896.95691: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882896.95697: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882896.95759: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882896.95977: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882896.96482: variable 'network_connections' from source: include params 30564 1726882896.96486: variable 'interface' from source: play vars 30564 1726882896.96548: variable 'interface' from source: play vars 30564 1726882896.96555: variable 'ansible_distribution' from source: facts 30564 1726882896.96558: variable '__network_rh_distros' from source: role '' defaults 30564 1726882896.96566: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.96580: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882896.96756: variable 'ansible_distribution' from source: facts 30564 1726882896.96760: variable '__network_rh_distros' from source: role '' defaults 30564 1726882896.96766: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.96779: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882896.96950: variable 'ansible_distribution' from source: facts 30564 1726882896.96954: variable '__network_rh_distros' from source: role '' defaults 30564 1726882896.96960: variable 'ansible_distribution_major_version' from source: facts 30564 1726882896.96994: variable 'network_provider' from source: set_fact 30564 1726882896.97016: variable 'omit' from source: magic vars 30564 1726882896.97042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882896.97074: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882896.97093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882896.97110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882896.97122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882896.97150: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882896.97153: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.97156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882896.97257: Set connection var ansible_timeout to 10 30564 1726882896.97262: Set connection var ansible_pipelining to False 30564 1726882896.97267: Set connection var ansible_shell_type to sh 30564 1726882896.97278: Set connection var ansible_shell_executable to /bin/sh 30564 1726882896.97286: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882896.97289: Set connection var ansible_connection to ssh 30564 1726882896.97315: variable 'ansible_shell_executable' from source: unknown 30564 1726882896.97318: variable 'ansible_connection' from source: unknown 30564 1726882896.97320: variable 'ansible_module_compression' from source: unknown 30564 1726882896.97322: variable 'ansible_shell_type' from source: unknown 30564 1726882896.97327: variable 'ansible_shell_executable' from source: unknown 30564 1726882896.97329: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882896.97334: variable 'ansible_pipelining' from source: unknown 30564 1726882896.97337: variable 'ansible_timeout' from source: unknown 30564 1726882896.97339: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882896.97445: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882896.97454: variable 'omit' from source: magic vars 30564 1726882896.97457: starting attempt loop 30564 1726882896.97460: running the handler 30564 1726882896.97542: variable 'ansible_facts' from source: unknown 30564 1726882896.98360: _low_level_execute_command(): starting 30564 1726882896.98370: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882896.99094: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882896.99107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882896.99116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882896.99134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882896.99177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882896.99185: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882896.99195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882896.99209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882896.99217: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882896.99220: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882896.99228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882896.99243: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882896.99254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882896.99261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882896.99272: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882896.99280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882896.99356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882896.99375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882896.99383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882896.99515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.01189: stdout chunk (state=3): >>>/root <<< 30564 1726882897.01377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.01380: stdout chunk (state=3): >>><<< 30564 1726882897.01383: stderr chunk (state=3): >>><<< 30564 1726882897.01487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882897.01491: _low_level_execute_command(): starting 30564 1726882897.01494: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676 `" && echo ansible-tmp-1726882897.013998-34748-82586427956676="` echo /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676 `" ) && sleep 0' 30564 1726882897.02310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.02314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.02357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.02361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.02365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.02438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882897.02488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.02600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.04473: stdout chunk (state=3): >>>ansible-tmp-1726882897.013998-34748-82586427956676=/root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676 <<< 30564 1726882897.04593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.04645: stderr chunk (state=3): >>><<< 30564 1726882897.04647: stdout chunk (state=3): >>><<< 30564 1726882897.04675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882897.013998-34748-82586427956676=/root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882897.04691: variable 'ansible_module_compression' from source: unknown 30564 1726882897.04730: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882897.04798: variable 'ansible_facts' from source: unknown 30564 1726882897.05024: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676/AnsiballZ_systemd.py 30564 1726882897.05207: Sending initial data 30564 1726882897.05210: Sent initial data (154 bytes) 30564 1726882897.06239: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.06242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.06283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.06286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.06289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.06337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 
1726882897.06344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.06446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.08195: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882897.08301: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882897.08400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpoxhioc3o /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676/AnsiballZ_systemd.py <<< 30564 1726882897.08500: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882897.10846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.10965: stderr chunk (state=3): >>><<< 30564 1726882897.10968: stdout chunk (state=3): >>><<< 30564 1726882897.10970: done transferring module to remote 30564 1726882897.10973: _low_level_execute_command(): starting 30564 1726882897.10975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676/ /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676/AnsiballZ_systemd.py && sleep 0' 30564 1726882897.11437: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.11441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.11490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.11497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882897.11501: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.11503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.11506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882897.11523: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882897.11526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.11615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882897.11619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882897.11622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.11743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.13510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.13551: stderr chunk (state=3): >>><<< 30564 1726882897.13553: stdout chunk (state=3): >>><<< 30564 
1726882897.13570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882897.13572: _low_level_execute_command(): starting 30564 1726882897.13594: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676/AnsiballZ_systemd.py && sleep 0' 30564 1726882897.13989: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882897.13997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.14004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.14014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.14043: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882897.14050: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882897.14061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.14076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882897.14082: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.14090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.14098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882897.14104: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.14152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882897.14182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882897.14185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.14295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.39477: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", 
"RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9187328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2303688000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": 
"18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 30564 1726882897.39499: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", 
"NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": 
"multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882897.41067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882897.41077: stdout chunk (state=3): >>><<< 30564 1726882897.41086: stderr chunk (state=3): >>><<< 30564 1726882897.41106: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9187328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2303688000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": 
"316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882897.41296: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882897.41315: _low_level_execute_command(): starting 30564 1726882897.41318: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882897.013998-34748-82586427956676/ > /dev/null 2>&1 && sleep 0' 30564 1726882897.42008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 
1726882897.42017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.42036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.42050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.42092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882897.42099: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882897.42109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.42121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882897.42129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882897.42144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882897.42151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.42160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.42174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.42181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882897.42188: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882897.42197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.42280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882897.42297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882897.42309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882897.42432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.44246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.44318: stderr chunk (state=3): >>><<< 30564 1726882897.44324: stdout chunk (state=3): >>><<< 30564 1726882897.44343: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882897.44349: handler run complete 30564 1726882897.44411: attempt loop complete, returning result 30564 1726882897.44415: _execute() done 30564 1726882897.44417: dumping result to json 30564 1726882897.44436: done dumping result, returning 30564 1726882897.44446: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000001d37] 30564 1726882897.44452: sending 
task result for task 0e448fcc-3ce9-4216-acec-000000001d37 30564 1726882897.44740: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d37 30564 1726882897.44744: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882897.44812: no more pending results, returning what we have 30564 1726882897.44815: results queue empty 30564 1726882897.44817: checking for any_errors_fatal 30564 1726882897.44823: done checking for any_errors_fatal 30564 1726882897.44824: checking for max_fail_percentage 30564 1726882897.44826: done checking for max_fail_percentage 30564 1726882897.44827: checking to see if all hosts have failed and the running result is not ok 30564 1726882897.44828: done checking to see if all hosts have failed 30564 1726882897.44829: getting the remaining hosts for this loop 30564 1726882897.44831: done getting the remaining hosts for this loop 30564 1726882897.44835: getting the next task for host managed_node2 30564 1726882897.44843: done getting next task for host managed_node2 30564 1726882897.44848: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882897.44854: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882897.44868: getting variables 30564 1726882897.44870: in VariableManager get_vars() 30564 1726882897.44910: Calling all_inventory to load vars for managed_node2 30564 1726882897.44913: Calling groups_inventory to load vars for managed_node2 30564 1726882897.44915: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882897.44926: Calling all_plugins_play to load vars for managed_node2 30564 1726882897.44930: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882897.44937: Calling groups_plugins_play to load vars for managed_node2 30564 1726882897.46803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882897.48665: done with get_vars() 30564 1726882897.48690: done getting variables 30564 1726882897.48746: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:37 -0400 (0:00:00.617) 0:01:36.069 
****** 30564 1726882897.48786: entering _queue_task() for managed_node2/service 30564 1726882897.49080: worker is 1 (out of 1 available) 30564 1726882897.49098: exiting _queue_task() for managed_node2/service 30564 1726882897.49110: done queuing things up, now waiting for results queue to drain 30564 1726882897.49112: waiting for pending results... 30564 1726882897.49413: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882897.49560: in run() - task 0e448fcc-3ce9-4216-acec-000000001d38 30564 1726882897.49574: variable 'ansible_search_path' from source: unknown 30564 1726882897.49578: variable 'ansible_search_path' from source: unknown 30564 1726882897.49614: calling self._execute() 30564 1726882897.49717: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882897.49721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882897.49732: variable 'omit' from source: magic vars 30564 1726882897.50174: variable 'ansible_distribution_major_version' from source: facts 30564 1726882897.50207: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882897.50323: variable 'network_provider' from source: set_fact 30564 1726882897.50331: Evaluated conditional (network_provider == "nm"): True 30564 1726882897.50438: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882897.50539: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882897.50731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882897.53202: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882897.53277: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882897.53312: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882897.53353: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882897.53380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882897.53475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882897.53505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882897.53530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882897.53581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882897.53595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882897.53638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882897.53672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882897.53697: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882897.53735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882897.53747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882897.53797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882897.53820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882897.53842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882897.53891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882897.53905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882897.54053: variable 'network_connections' from source: include params 30564 1726882897.54065: variable 'interface' from source: play vars 30564 1726882897.54137: variable 'interface' from source: play vars 30564 1726882897.54218: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882897.54386: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882897.54430: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882897.54460: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882897.54491: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882897.54540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882897.54565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882897.54590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882897.54614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882897.54674: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882897.54930: variable 'network_connections' from source: include params 30564 1726882897.54934: variable 'interface' from source: play vars 30564 1726882897.55008: variable 'interface' from source: play vars 30564 1726882897.55035: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882897.55038: when evaluation is False, skipping this task 30564 1726882897.55041: _execute() done 30564 
1726882897.55043: dumping result to json 30564 1726882897.55045: done dumping result, returning 30564 1726882897.55053: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000001d38] 30564 1726882897.55066: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d38 30564 1726882897.55159: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d38 30564 1726882897.55161: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882897.55227: no more pending results, returning what we have 30564 1726882897.55232: results queue empty 30564 1726882897.55233: checking for any_errors_fatal 30564 1726882897.55256: done checking for any_errors_fatal 30564 1726882897.55257: checking for max_fail_percentage 30564 1726882897.55260: done checking for max_fail_percentage 30564 1726882897.55261: checking to see if all hosts have failed and the running result is not ok 30564 1726882897.55261: done checking to see if all hosts have failed 30564 1726882897.55262: getting the remaining hosts for this loop 30564 1726882897.55266: done getting the remaining hosts for this loop 30564 1726882897.55270: getting the next task for host managed_node2 30564 1726882897.55281: done getting next task for host managed_node2 30564 1726882897.55285: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882897.55292: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882897.55319: getting variables 30564 1726882897.55321: in VariableManager get_vars() 30564 1726882897.55366: Calling all_inventory to load vars for managed_node2 30564 1726882897.55369: Calling groups_inventory to load vars for managed_node2 30564 1726882897.55372: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882897.55383: Calling all_plugins_play to load vars for managed_node2 30564 1726882897.55386: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882897.55389: Calling groups_plugins_play to load vars for managed_node2 30564 1726882897.57163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882897.59030: done with get_vars() 30564 1726882897.59060: done getting variables 30564 1726882897.59117: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:37 -0400 (0:00:00.103) 0:01:36.172 ****** 30564 1726882897.59149: entering _queue_task() for managed_node2/service 30564 1726882897.59453: worker is 1 (out of 1 available) 30564 1726882897.59468: exiting _queue_task() for managed_node2/service 30564 1726882897.59481: done queuing things up, now waiting for results queue to drain 30564 1726882897.59489: waiting for pending results... 30564 1726882897.59789: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882897.59936: in run() - task 0e448fcc-3ce9-4216-acec-000000001d39 30564 1726882897.59953: variable 'ansible_search_path' from source: unknown 30564 1726882897.59957: variable 'ansible_search_path' from source: unknown 30564 1726882897.59998: calling self._execute() 30564 1726882897.60116: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882897.60120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882897.60131: variable 'omit' from source: magic vars 30564 1726882897.60574: variable 'ansible_distribution_major_version' from source: facts 30564 1726882897.60600: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882897.60731: variable 'network_provider' from source: set_fact 30564 1726882897.60738: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882897.60741: when evaluation is False, skipping this task 30564 1726882897.60744: _execute() done 30564 1726882897.60746: dumping result to json 30564 1726882897.60748: done dumping result, returning 30564 1726882897.60757: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000001d39] 30564 1726882897.60764: sending task result for task 
0e448fcc-3ce9-4216-acec-000000001d39 30564 1726882897.60873: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d39 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882897.60930: no more pending results, returning what we have 30564 1726882897.60935: results queue empty 30564 1726882897.60936: checking for any_errors_fatal 30564 1726882897.60945: done checking for any_errors_fatal 30564 1726882897.60946: checking for max_fail_percentage 30564 1726882897.60948: done checking for max_fail_percentage 30564 1726882897.60950: checking to see if all hosts have failed and the running result is not ok 30564 1726882897.60950: done checking to see if all hosts have failed 30564 1726882897.60951: getting the remaining hosts for this loop 30564 1726882897.60954: done getting the remaining hosts for this loop 30564 1726882897.60958: getting the next task for host managed_node2 30564 1726882897.60970: done getting next task for host managed_node2 30564 1726882897.60975: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882897.60982: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882897.61001: WORKER PROCESS EXITING 30564 1726882897.61022: getting variables 30564 1726882897.61025: in VariableManager get_vars() 30564 1726882897.61076: Calling all_inventory to load vars for managed_node2 30564 1726882897.61080: Calling groups_inventory to load vars for managed_node2 30564 1726882897.61083: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882897.61097: Calling all_plugins_play to load vars for managed_node2 30564 1726882897.61101: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882897.61104: Calling groups_plugins_play to load vars for managed_node2 30564 1726882897.63096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882897.64922: done with get_vars() 30564 1726882897.64949: done getting variables 30564 1726882897.65014: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:37 -0400 (0:00:00.059) 0:01:36.231 ****** 30564 1726882897.65053: entering _queue_task() for managed_node2/copy 30564 1726882897.65398: worker is 1 (out of 1 available) 30564 
1726882897.65411: exiting _queue_task() for managed_node2/copy 30564 1726882897.65423: done queuing things up, now waiting for results queue to drain 30564 1726882897.65424: waiting for pending results... 30564 1726882897.65741: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882897.65889: in run() - task 0e448fcc-3ce9-4216-acec-000000001d3a 30564 1726882897.65908: variable 'ansible_search_path' from source: unknown 30564 1726882897.65912: variable 'ansible_search_path' from source: unknown 30564 1726882897.65948: calling self._execute() 30564 1726882897.66056: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882897.66060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882897.66074: variable 'omit' from source: magic vars 30564 1726882897.66507: variable 'ansible_distribution_major_version' from source: facts 30564 1726882897.66531: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882897.66662: variable 'network_provider' from source: set_fact 30564 1726882897.66674: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882897.66678: when evaluation is False, skipping this task 30564 1726882897.66681: _execute() done 30564 1726882897.66683: dumping result to json 30564 1726882897.66686: done dumping result, returning 30564 1726882897.66695: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000001d3a] 30564 1726882897.66699: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3a 30564 1726882897.66808: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3a 30564 1726882897.66811: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882897.66868: no more pending results, returning what we have 30564 1726882897.66873: results queue empty 30564 1726882897.66875: checking for any_errors_fatal 30564 1726882897.66885: done checking for any_errors_fatal 30564 1726882897.66886: checking for max_fail_percentage 30564 1726882897.66888: done checking for max_fail_percentage 30564 1726882897.66889: checking to see if all hosts have failed and the running result is not ok 30564 1726882897.66890: done checking to see if all hosts have failed 30564 1726882897.66891: getting the remaining hosts for this loop 30564 1726882897.66893: done getting the remaining hosts for this loop 30564 1726882897.66898: getting the next task for host managed_node2 30564 1726882897.66909: done getting next task for host managed_node2 30564 1726882897.66914: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882897.66922: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882897.66951: getting variables 30564 1726882897.66954: in VariableManager get_vars() 30564 1726882897.67006: Calling all_inventory to load vars for managed_node2 30564 1726882897.67010: Calling groups_inventory to load vars for managed_node2 30564 1726882897.67012: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882897.67026: Calling all_plugins_play to load vars for managed_node2 30564 1726882897.67029: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882897.67032: Calling groups_plugins_play to load vars for managed_node2 30564 1726882897.68965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882897.70280: done with get_vars() 30564 1726882897.70299: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:37 -0400 (0:00:00.053) 0:01:36.284 ****** 30564 1726882897.70365: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882897.70602: worker is 1 (out of 1 available) 30564 1726882897.70617: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882897.70630: done queuing things up, now waiting for results queue to drain 30564 1726882897.70631: waiting for pending results... 
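The log entries above repeatedly show the same skip pattern: each `when` clause is evaluated against the host's variables ("Evaluated conditional (network_provider == "initscripts"): False"), and the task is skipped with a `false_condition` result as soon as one clause fails. A minimal sketch of that flow (this is an illustration only, not Ansible's real internals — Ansible renders each condition through Jinja2, and `evaluate_when` is a hypothetical helper):

```python
def evaluate_when(conditions, variables):
    """Return (should_run, first_false_condition), mimicking the log's
    'when evaluation is False, skipping this task' behavior."""
    for cond in conditions:
        # Ansible templates each condition with Jinja2; eval() is a
        # stand-in for illustration only.
        if not eval(cond, {}, dict(variables)):
            return False, cond
    return True, None

# Variables as they appear in the log ("from source: facts" / "set_fact"):
task_vars = {
    "ansible_distribution_major_version": "9",
    "network_provider": "nm",
}
when = [
    "ansible_distribution_major_version != '6'",   # True in the log
    "network_provider == 'initscripts'",           # False in the log
]
run, failed = evaluate_when(when, task_vars)
result = ({} if run else {
    "changed": False,
    "false_condition": failed,
    "skip_reason": "Conditional result was False",
})
```

With these inputs the first clause passes and the second fails, which matches the `skipping: [managed_node2]` JSON bodies printed in the log.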
30564 1726882897.70833: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882897.70930: in run() - task 0e448fcc-3ce9-4216-acec-000000001d3b 30564 1726882897.70940: variable 'ansible_search_path' from source: unknown 30564 1726882897.70943: variable 'ansible_search_path' from source: unknown 30564 1726882897.70981: calling self._execute() 30564 1726882897.71069: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882897.71074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882897.71085: variable 'omit' from source: magic vars 30564 1726882897.71378: variable 'ansible_distribution_major_version' from source: facts 30564 1726882897.71390: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882897.71399: variable 'omit' from source: magic vars 30564 1726882897.71463: variable 'omit' from source: magic vars 30564 1726882897.71745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882897.73616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882897.73660: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882897.73689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882897.73718: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882897.73738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882897.73798: variable 'network_provider' from source: set_fact 30564 1726882897.73895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882897.73919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882897.73937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882897.73966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882897.73978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882897.74034: variable 'omit' from source: magic vars 30564 1726882897.74106: variable 'omit' from source: magic vars 30564 1726882897.74182: variable 'network_connections' from source: include params 30564 1726882897.74190: variable 'interface' from source: play vars 30564 1726882897.74234: variable 'interface' from source: play vars 30564 1726882897.74341: variable 'omit' from source: magic vars 30564 1726882897.74352: variable '__lsr_ansible_managed' from source: task vars 30564 1726882897.74394: variable '__lsr_ansible_managed' from source: task vars 30564 1726882897.74524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882897.74658: Loaded config def from plugin (lookup/template) 30564 1726882897.74663: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882897.74689: File lookup term: get_ansible_managed.j2 30564 1726882897.74692: variable 
'ansible_search_path' from source: unknown 30564 1726882897.74696: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882897.74729: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882897.74733: variable 'ansible_search_path' from source: unknown 30564 1726882897.79448: variable 'ansible_managed' from source: unknown 30564 1726882897.79713: variable 'omit' from source: magic vars 30564 1726882897.79717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882897.79720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882897.79722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882897.79724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882897.79727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882897.79729: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882897.79731: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882897.79733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882897.80051: Set connection var ansible_timeout to 10 30564 1726882897.80055: Set connection var ansible_pipelining to False 30564 1726882897.80057: Set connection var ansible_shell_type to sh 30564 1726882897.80061: Set connection var ansible_shell_executable to /bin/sh 30564 1726882897.80065: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882897.80067: Set connection var ansible_connection to ssh 30564 1726882897.80068: variable 'ansible_shell_executable' from source: unknown 30564 1726882897.80070: variable 'ansible_connection' from source: unknown 30564 1726882897.80072: variable 'ansible_module_compression' from source: unknown 30564 1726882897.80074: variable 'ansible_shell_type' from source: unknown 30564 1726882897.80076: variable 'ansible_shell_executable' from source: unknown 30564 1726882897.80078: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882897.80080: variable 'ansible_pipelining' from source: unknown 30564 1726882897.80082: variable 'ansible_timeout' from source: unknown 30564 1726882897.80084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882897.80086: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882897.80097: variable 'omit' from 
source: magic vars 30564 1726882897.80099: starting attempt loop 30564 1726882897.80102: running the handler 30564 1726882897.80104: _low_level_execute_command(): starting 30564 1726882897.80106: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882897.80730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.80734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.80839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.82486: stdout chunk (state=3): >>>/root <<< 30564 1726882897.82600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.82651: stderr chunk (state=3): >>><<< 30564 1726882897.82653: stdout chunk (state=3): >>><<< 30564 1726882897.82731: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882897.82734: _low_level_execute_command(): starting 30564 1726882897.82737: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367 `" && echo ansible-tmp-1726882897.8267136-34780-187626785570367="` echo /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367 `" ) && sleep 0' 30564 1726882897.83115: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.83118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.83172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.83236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882897.83249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882897.83255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.83392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.85252: stdout chunk (state=3): >>>ansible-tmp-1726882897.8267136-34780-187626785570367=/root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367 <<< 30564 1726882897.85357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.85405: stderr chunk (state=3): >>><<< 30564 1726882897.85408: stdout chunk (state=3): >>><<< 30564 1726882897.85420: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882897.8267136-34780-187626785570367=/root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882897.85457: variable 'ansible_module_compression' from source: unknown 30564 1726882897.85498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882897.85521: variable 'ansible_facts' from source: unknown 30564 1726882897.85588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367/AnsiballZ_network_connections.py 30564 1726882897.85702: Sending initial data 30564 1726882897.85706: Sent initial data (168 bytes) 30564 1726882897.86339: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.86345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.86381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.86384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.86391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.86432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882897.86439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.86554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.88284: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 30564 1726882897.88289: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882897.88382: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882897.88485: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpk74ku1sx /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367/AnsiballZ_network_connections.py <<< 30564 1726882897.88583: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882897.89947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.90051: stderr chunk (state=3): >>><<< 30564 1726882897.90054: stdout chunk (state=3): >>><<< 30564 1726882897.90076: done transferring module to remote 30564 1726882897.90084: _low_level_execute_command(): starting 30564 1726882897.90088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367/ /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367/AnsiballZ_network_connections.py && sleep 0' 30564 1726882897.90529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.90535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.90570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882897.90582: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.90594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.90644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 
1726882897.90651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.90768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882897.92518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882897.92571: stderr chunk (state=3): >>><<< 30564 1726882897.92575: stdout chunk (state=3): >>><<< 30564 1726882897.92587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882897.92592: _low_level_execute_command(): starting 30564 1726882897.92594: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367/AnsiballZ_network_connections.py && sleep 0' 30564 1726882897.93062: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882897.93070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882897.93103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882897.93107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882897.93109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882897.93157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882897.93163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882897.93276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882898.23236: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": 
[{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882898.24881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882898.24891: stdout chunk (state=3): >>><<< 30564 1726882898.24895: stderr chunk (state=3): >>><<< 30564 1726882898.24911: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882898.24938: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882898.24946: _low_level_execute_command(): starting 30564 1726882898.24951: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882897.8267136-34780-187626785570367/ > /dev/null 2>&1 && sleep 0' 30564 1726882898.25402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.25406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 
1726882898.25436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.25441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.25443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.25498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882898.25506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882898.25616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882898.27425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882898.27470: stderr chunk (state=3): >>><<< 30564 1726882898.27474: stdout chunk (state=3): >>><<< 30564 1726882898.27488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882898.27493: handler run complete 30564 1726882898.27516: attempt loop complete, returning result 30564 1726882898.27519: _execute() done 30564 1726882898.27521: dumping result to json 30564 1726882898.27525: done dumping result, returning 30564 1726882898.27533: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-000000001d3b] 30564 1726882898.27539: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3b 30564 1726882898.27641: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3b 30564 1726882898.27644: WORKER PROCESS EXITING
changed: [managed_node2] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "statebr",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete

30564 1726882898.27743: no more pending results, returning what we have 30564 1726882898.27746: results queue empty 30564 1726882898.27747: checking for any_errors_fatal 30564 1726882898.27754: done checking for any_errors_fatal 30564 1726882898.27754: 
checking for max_fail_percentage 30564 1726882898.27756: done checking for max_fail_percentage 30564 1726882898.27757: checking to see if all hosts have failed and the running result is not ok 30564 1726882898.27758: done checking to see if all hosts have failed 30564 1726882898.27763: getting the remaining hosts for this loop 30564 1726882898.27775: done getting the remaining hosts for this loop 30564 1726882898.27779: getting the next task for host managed_node2 30564 1726882898.27786: done getting next task for host managed_node2 30564 1726882898.27790: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882898.27794: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882898.27806: getting variables 30564 1726882898.27808: in VariableManager get_vars() 30564 1726882898.27847: Calling all_inventory to load vars for managed_node2 30564 1726882898.27849: Calling groups_inventory to load vars for managed_node2 30564 1726882898.27851: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882898.27861: Calling all_plugins_play to load vars for managed_node2 30564 1726882898.27866: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882898.27871: Calling groups_plugins_play to load vars for managed_node2 30564 1726882898.28757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882898.29729: done with get_vars() 30564 1726882898.29746: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024  21:41:38 -0400 (0:00:00.594)       0:01:36.879 ******

30564 1726882898.29810: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882898.30021: worker is 1 (out of 1 available) 30564 1726882898.30032: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882898.30044: done queuing things up, now waiting for results queue to drain 30564 1726882898.30046: waiting for pending results... 
30564 1726882898.30240: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882898.30335: in run() - task 0e448fcc-3ce9-4216-acec-000000001d3c 30564 1726882898.30347: variable 'ansible_search_path' from source: unknown 30564 1726882898.30350: variable 'ansible_search_path' from source: unknown 30564 1726882898.30383: calling self._execute() 30564 1726882898.30458: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.30462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.30477: variable 'omit' from source: magic vars 30564 1726882898.30755: variable 'ansible_distribution_major_version' from source: facts 30564 1726882898.30767: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882898.30852: variable 'network_state' from source: role '' defaults 30564 1726882898.30862: Evaluated conditional (network_state != {}): False 30564 1726882898.30867: when evaluation is False, skipping this task 30564 1726882898.30872: _execute() done 30564 1726882898.30875: dumping result to json 30564 1726882898.30877: done dumping result, returning 30564 1726882898.30880: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000001d3c] 30564 1726882898.30886: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3c 30564 1726882898.30978: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3c 30564 1726882898.30981: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882898.31033: no more pending results, returning what we have 30564 1726882898.31037: results queue empty 30564 1726882898.31038: checking for any_errors_fatal 30564 1726882898.31046: done checking for any_errors_fatal 
30564 1726882898.31047: checking for max_fail_percentage 30564 1726882898.31049: done checking for max_fail_percentage 30564 1726882898.31049: checking to see if all hosts have failed and the running result is not ok 30564 1726882898.31050: done checking to see if all hosts have failed 30564 1726882898.31051: getting the remaining hosts for this loop 30564 1726882898.31052: done getting the remaining hosts for this loop 30564 1726882898.31055: getting the next task for host managed_node2 30564 1726882898.31061: done getting next task for host managed_node2 30564 1726882898.31067: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882898.31074: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882898.31102: getting variables 30564 1726882898.31104: in VariableManager get_vars() 30564 1726882898.31134: Calling all_inventory to load vars for managed_node2 30564 1726882898.31136: Calling groups_inventory to load vars for managed_node2 30564 1726882898.31137: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882898.31144: Calling all_plugins_play to load vars for managed_node2 30564 1726882898.31146: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882898.31147: Calling groups_plugins_play to load vars for managed_node2 30564 1726882898.32050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882898.32996: done with get_vars() 30564 1726882898.33010: done getting variables 30564 1726882898.33052: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024  21:41:38 -0400 (0:00:00.032)       0:01:36.912 ******

30564 1726882898.33078: entering _queue_task() for managed_node2/debug 30564 1726882898.33265: worker is 1 (out of 1 available) 30564 1726882898.33277: exiting _queue_task() for managed_node2/debug 30564 1726882898.33290: done queuing things up, now waiting for results queue to drain 30564 1726882898.33291: waiting for pending results... 
30564 1726882898.33486: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882898.33579: in run() - task 0e448fcc-3ce9-4216-acec-000000001d3d 30564 1726882898.33590: variable 'ansible_search_path' from source: unknown 30564 1726882898.33594: variable 'ansible_search_path' from source: unknown 30564 1726882898.33630: calling self._execute() 30564 1726882898.33710: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.33716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.33725: variable 'omit' from source: magic vars 30564 1726882898.34005: variable 'ansible_distribution_major_version' from source: facts 30564 1726882898.34016: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882898.34022: variable 'omit' from source: magic vars 30564 1726882898.34071: variable 'omit' from source: magic vars 30564 1726882898.34096: variable 'omit' from source: magic vars 30564 1726882898.34129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882898.34158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882898.34178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882898.34191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882898.34201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882898.34223: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882898.34226: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.34228: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882898.34303: Set connection var ansible_timeout to 10 30564 1726882898.34306: Set connection var ansible_pipelining to False 30564 1726882898.34309: Set connection var ansible_shell_type to sh 30564 1726882898.34314: Set connection var ansible_shell_executable to /bin/sh 30564 1726882898.34321: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882898.34323: Set connection var ansible_connection to ssh 30564 1726882898.34341: variable 'ansible_shell_executable' from source: unknown 30564 1726882898.34344: variable 'ansible_connection' from source: unknown 30564 1726882898.34347: variable 'ansible_module_compression' from source: unknown 30564 1726882898.34349: variable 'ansible_shell_type' from source: unknown 30564 1726882898.34351: variable 'ansible_shell_executable' from source: unknown 30564 1726882898.34355: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.34357: variable 'ansible_pipelining' from source: unknown 30564 1726882898.34359: variable 'ansible_timeout' from source: unknown 30564 1726882898.34370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.34461: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882898.34477: variable 'omit' from source: magic vars 30564 1726882898.34481: starting attempt loop 30564 1726882898.34483: running the handler 30564 1726882898.34578: variable '__network_connections_result' from source: set_fact 30564 1726882898.34620: handler run complete 30564 1726882898.34632: attempt loop complete, returning result 30564 1726882898.34635: _execute() done 30564 1726882898.34637: dumping result to json 30564 1726882898.34639: 
done dumping result, returning 30564 1726882898.34648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001d3d] 30564 1726882898.34652: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3d 30564 1726882898.34740: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3d 30564 1726882898.34743: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete"
    ]
}
30564 1726882898.34815: no more pending results, returning what we have 30564 1726882898.34817: results queue empty 30564 1726882898.34818: checking for any_errors_fatal 30564 1726882898.34825: done checking for any_errors_fatal 30564 1726882898.34826: checking for max_fail_percentage 30564 1726882898.34827: done checking for max_fail_percentage 30564 1726882898.34828: checking to see if all hosts have failed and the running result is not ok 30564 1726882898.34829: done checking to see if all hosts have failed 30564 1726882898.34830: getting the remaining hosts for this loop 30564 1726882898.34831: done getting the remaining hosts for this loop 30564 1726882898.34834: getting the next task for host managed_node2 30564 1726882898.34840: done getting next task for host managed_node2 30564 1726882898.34843: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882898.34848: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882898.34869: getting variables 30564 1726882898.34871: in VariableManager get_vars() 30564 1726882898.34902: Calling all_inventory to load vars for managed_node2 30564 1726882898.34904: Calling groups_inventory to load vars for managed_node2 30564 1726882898.34905: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882898.34912: Calling all_plugins_play to load vars for managed_node2 30564 1726882898.34913: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882898.34915: Calling groups_plugins_play to load vars for managed_node2 30564 1726882898.35717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882898.36802: done with get_vars() 30564 1726882898.36820: done getting variables 30564 1726882898.36858: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:38 -0400 (0:00:00.038) 0:01:36.950 ****** 30564 1726882898.36886: entering _queue_task() for managed_node2/debug 30564 1726882898.37073: worker is 1 (out of 1 available) 30564 1726882898.37086: exiting _queue_task() for managed_node2/debug 30564 1726882898.37098: done queuing things up, now waiting for results queue to drain 30564 1726882898.37099: waiting for pending results... 30564 1726882898.37291: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882898.37389: in run() - task 0e448fcc-3ce9-4216-acec-000000001d3e 30564 1726882898.37399: variable 'ansible_search_path' from source: unknown 30564 1726882898.37403: variable 'ansible_search_path' from source: unknown 30564 1726882898.37431: calling self._execute() 30564 1726882898.37509: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.37513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.37522: variable 'omit' from source: magic vars 30564 1726882898.37826: variable 'ansible_distribution_major_version' from source: facts 30564 1726882898.37831: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882898.37836: variable 'omit' from source: magic vars 30564 1726882898.37905: variable 'omit' from source: magic vars 30564 1726882898.37951: variable 'omit' from source: magic vars 30564 1726882898.37998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882898.38040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882898.38071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882898.38093: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882898.38116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882898.38151: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882898.38170: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.38179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.38302: Set connection var ansible_timeout to 10 30564 1726882898.38312: Set connection var ansible_pipelining to False 30564 1726882898.38318: Set connection var ansible_shell_type to sh 30564 1726882898.38332: Set connection var ansible_shell_executable to /bin/sh 30564 1726882898.38343: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882898.38349: Set connection var ansible_connection to ssh 30564 1726882898.38378: variable 'ansible_shell_executable' from source: unknown 30564 1726882898.38394: variable 'ansible_connection' from source: unknown 30564 1726882898.38402: variable 'ansible_module_compression' from source: unknown 30564 1726882898.38408: variable 'ansible_shell_type' from source: unknown 30564 1726882898.38414: variable 'ansible_shell_executable' from source: unknown 30564 1726882898.38419: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.38426: variable 'ansible_pipelining' from source: unknown 30564 1726882898.38431: variable 'ansible_timeout' from source: unknown 30564 1726882898.38442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.38588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882898.38614: variable 'omit' from source: magic vars 30564 1726882898.38624: starting attempt loop 30564 1726882898.38630: running the handler 30564 1726882898.38689: variable '__network_connections_result' from source: set_fact 30564 1726882898.38784: variable '__network_connections_result' from source: set_fact 30564 1726882898.38905: handler run complete 30564 1726882898.38946: attempt loop complete, returning result 30564 1726882898.38953: _execute() done 30564 1726882898.38960: dumping result to json 30564 1726882898.38971: done dumping result, returning 30564 1726882898.38986: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-000000001d3e] 30564 1726882898.38995: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3e 30564 1726882898.39135: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3e 30564 1726882898.39137: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[001] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30564 1726882898.39247: no more pending results, returning what we have 30564 1726882898.39250: results queue empty 30564 1726882898.39251: checking for any_errors_fatal 30564 1726882898.39256: 
done checking for any_errors_fatal 30564 1726882898.39257: checking for max_fail_percentage 30564 1726882898.39258: done checking for max_fail_percentage 30564 1726882898.39259: checking to see if all hosts have failed and the running result is not ok 30564 1726882898.39260: done checking to see if all hosts have failed 30564 1726882898.39260: getting the remaining hosts for this loop 30564 1726882898.39262: done getting the remaining hosts for this loop 30564 1726882898.39267: getting the next task for host managed_node2 30564 1726882898.39273: done getting next task for host managed_node2 30564 1726882898.39277: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882898.39281: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882898.39293: getting variables 30564 1726882898.39294: in VariableManager get_vars() 30564 1726882898.39326: Calling all_inventory to load vars for managed_node2 30564 1726882898.39328: Calling groups_inventory to load vars for managed_node2 30564 1726882898.39330: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882898.39339: Calling all_plugins_play to load vars for managed_node2 30564 1726882898.39341: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882898.39348: Calling groups_plugins_play to load vars for managed_node2 30564 1726882898.40125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882898.41159: done with get_vars() 30564 1726882898.41181: done getting variables 30564 1726882898.41232: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:38 -0400 (0:00:00.043) 0:01:36.993 ****** 30564 1726882898.41260: entering _queue_task() for managed_node2/debug 30564 1726882898.41484: worker is 1 (out of 1 available) 30564 1726882898.41496: exiting _queue_task() for managed_node2/debug 30564 1726882898.41508: done queuing things up, now waiting for results queue to drain 30564 1726882898.41510: waiting for pending results... 
30564 1726882898.41803: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882898.41947: in run() - task 0e448fcc-3ce9-4216-acec-000000001d3f 30564 1726882898.41970: variable 'ansible_search_path' from source: unknown 30564 1726882898.41977: variable 'ansible_search_path' from source: unknown 30564 1726882898.42013: calling self._execute() 30564 1726882898.42115: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.42127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.42140: variable 'omit' from source: magic vars 30564 1726882898.42525: variable 'ansible_distribution_major_version' from source: facts 30564 1726882898.42543: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882898.42672: variable 'network_state' from source: role '' defaults 30564 1726882898.42690: Evaluated conditional (network_state != {}): False 30564 1726882898.42697: when evaluation is False, skipping this task 30564 1726882898.42704: _execute() done 30564 1726882898.42712: dumping result to json 30564 1726882898.42721: done dumping result, returning 30564 1726882898.42731: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-000000001d3f] 30564 1726882898.42740: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3f skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882898.42879: no more pending results, returning what we have 30564 1726882898.42884: results queue empty 30564 1726882898.42885: checking for any_errors_fatal 30564 1726882898.42897: done checking for any_errors_fatal 30564 1726882898.42897: checking for max_fail_percentage 30564 1726882898.42899: done checking for max_fail_percentage 30564 1726882898.42900: checking to see if all hosts have 
failed and the running result is not ok 30564 1726882898.42901: done checking to see if all hosts have failed 30564 1726882898.42902: getting the remaining hosts for this loop 30564 1726882898.42904: done getting the remaining hosts for this loop 30564 1726882898.42907: getting the next task for host managed_node2 30564 1726882898.42917: done getting next task for host managed_node2 30564 1726882898.42921: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882898.42927: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882898.42952: getting variables 30564 1726882898.42954: in VariableManager get_vars() 30564 1726882898.42998: Calling all_inventory to load vars for managed_node2 30564 1726882898.43001: Calling groups_inventory to load vars for managed_node2 30564 1726882898.43003: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882898.43016: Calling all_plugins_play to load vars for managed_node2 30564 1726882898.43019: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882898.43022: Calling groups_plugins_play to load vars for managed_node2 30564 1726882898.44120: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d3f 30564 1726882898.44124: WORKER PROCESS EXITING 30564 1726882898.44783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882898.46552: done with get_vars() 30564 1726882898.46578: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:38 -0400 (0:00:00.054) 0:01:37.048 ****** 30564 1726882898.46673: entering _queue_task() for managed_node2/ping 30564 1726882898.46931: worker is 1 (out of 1 available) 30564 1726882898.46944: exiting _queue_task() for managed_node2/ping 30564 1726882898.46957: done queuing things up, now waiting for results queue to drain 30564 1726882898.46958: waiting for pending results... 
30564 1726882898.47308: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882898.47449: in run() - task 0e448fcc-3ce9-4216-acec-000000001d40 30564 1726882898.47472: variable 'ansible_search_path' from source: unknown 30564 1726882898.47480: variable 'ansible_search_path' from source: unknown 30564 1726882898.47517: calling self._execute() 30564 1726882898.47620: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.47631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.47644: variable 'omit' from source: magic vars 30564 1726882898.48032: variable 'ansible_distribution_major_version' from source: facts 30564 1726882898.48052: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882898.48065: variable 'omit' from source: magic vars 30564 1726882898.48141: variable 'omit' from source: magic vars 30564 1726882898.48179: variable 'omit' from source: magic vars 30564 1726882898.48228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882898.48269: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882898.48293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882898.48320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882898.48336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882898.48373: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882898.48382: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.48390: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882898.48497: Set connection var ansible_timeout to 10 30564 1726882898.48641: Set connection var ansible_pipelining to False 30564 1726882898.48649: Set connection var ansible_shell_type to sh 30564 1726882898.48659: Set connection var ansible_shell_executable to /bin/sh 30564 1726882898.48674: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882898.48682: Set connection var ansible_connection to ssh 30564 1726882898.48712: variable 'ansible_shell_executable' from source: unknown 30564 1726882898.48720: variable 'ansible_connection' from source: unknown 30564 1726882898.48727: variable 'ansible_module_compression' from source: unknown 30564 1726882898.48736: variable 'ansible_shell_type' from source: unknown 30564 1726882898.48745: variable 'ansible_shell_executable' from source: unknown 30564 1726882898.48752: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882898.48760: variable 'ansible_pipelining' from source: unknown 30564 1726882898.48769: variable 'ansible_timeout' from source: unknown 30564 1726882898.48777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882898.49208: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882898.49223: variable 'omit' from source: magic vars 30564 1726882898.49232: starting attempt loop 30564 1726882898.49239: running the handler 30564 1726882898.49257: _low_level_execute_command(): starting 30564 1726882898.49299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882898.50614: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882898.50632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882898.50646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.50666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.50710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.50721: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882898.50734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.50755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882898.50768: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882898.50780: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882898.50794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.50809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.50824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.50835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.50847: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882898.50862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.50938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882898.50965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882898.50984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882898.51122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882898.52795: stdout chunk (state=3): >>>/root <<< 30564 1726882898.53046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882898.53139: stderr chunk (state=3): >>><<< 30564 1726882898.53142: stdout chunk (state=3): >>><<< 30564 1726882898.53259: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882898.53263: _low_level_execute_command(): starting 30564 1726882898.53268: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699 `" && echo ansible-tmp-1726882898.5317192-34815-196684343360699="` echo /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699 `" ) && sleep 0' 30564 1726882898.55486: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.55542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882898.55895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882898.56108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882898.58022: stdout chunk (state=3): >>>ansible-tmp-1726882898.5317192-34815-196684343360699=/root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699 <<< 30564 1726882898.58180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882898.58214: stderr chunk (state=3): >>><<< 30564 1726882898.58217: stdout chunk (state=3): >>><<< 30564 1726882898.58236: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882898.5317192-34815-196684343360699=/root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882898.58288: variable 'ansible_module_compression' from source: unknown 30564 1726882898.58332: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882898.58366: variable 'ansible_facts' from source: unknown 30564 1726882898.58442: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699/AnsiballZ_ping.py 30564 1726882898.59090: Sending initial data 30564 1726882898.59093: Sent initial data (153 bytes) 30564 1726882898.62053: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882898.62056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.62080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.62115: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.62122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.62136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882898.62141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.62222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882898.62226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882898.62238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882898.62365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882898.64199: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882898.64294: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: 
Server handle limit 1019; using 64 <<< 30564 1726882898.64399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp9na94zi8 /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699/AnsiballZ_ping.py <<< 30564 1726882898.64494: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882898.66025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882898.66151: stderr chunk (state=3): >>><<< 30564 1726882898.66154: stdout chunk (state=3): >>><<< 30564 1726882898.66156: done transferring module to remote 30564 1726882898.66159: _low_level_execute_command(): starting 30564 1726882898.66161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699/ /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699/AnsiballZ_ping.py && sleep 0' 30564 1726882898.66747: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882898.66759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.66779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.66800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.66847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.66859: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882898.66875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.66892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882898.66902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882898.66917: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882898.66941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.66954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.66971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.66983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.66993: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882898.67006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.67092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882898.67113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882898.67131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882898.67269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882898.69099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882898.69102: stdout chunk (state=3): >>><<< 30564 1726882898.69104: stderr chunk (state=3): >>><<< 30564 1726882898.69173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882898.69177: _low_level_execute_command(): starting 30564 1726882898.69181: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699/AnsiballZ_ping.py && sleep 0' 30564 1726882898.71129: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882898.71137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.71148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.71162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.71202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.71212: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882898.71221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.71231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882898.71237: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882898.71244: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882898.71251: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.71259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.71276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.71283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.71290: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882898.71298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.71369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882898.71387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882898.71397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882898.71523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882898.84460: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882898.85472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882898.85479: stdout chunk (state=3): >>><<< 30564 1726882898.85490: stderr chunk (state=3): >>><<< 30564 1726882898.85507: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882898.85530: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882898.85538: _low_level_execute_command(): starting 30564 1726882898.85543: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882898.5317192-34815-196684343360699/ > /dev/null 2>&1 && sleep 0' 30564 1726882898.87673: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882898.87677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.87679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.87682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.87688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.87690: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882898.87693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.87695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882898.87783: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 
1726882898.87787: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882898.87796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882898.87806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882898.87816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882898.87824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882898.87830: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882898.87840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882898.87929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882898.87933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882898.87941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882898.88112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882898.89959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882898.89963: stdout chunk (state=3): >>><<< 30564 1726882898.89976: stderr chunk (state=3): >>><<< 30564 1726882898.89991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882898.89998: handler run complete 30564 1726882898.90015: attempt loop complete, returning result 30564 1726882898.90018: _execute() done 30564 1726882898.90020: dumping result to json 30564 1726882898.90022: done dumping result, returning 30564 1726882898.90033: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-000000001d40] 30564 1726882898.90039: sending task result for task 0e448fcc-3ce9-4216-acec-000000001d40 30564 1726882898.90143: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001d40 30564 1726882898.90146: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882898.90237: no more pending results, returning what we have 30564 1726882898.90240: results queue empty 30564 1726882898.90241: checking for any_errors_fatal 30564 1726882898.90249: done checking for any_errors_fatal 30564 1726882898.90250: checking for max_fail_percentage 30564 1726882898.90251: done checking for max_fail_percentage 30564 1726882898.90252: checking to see if all hosts have failed and the running result is not ok 30564 1726882898.90253: done checking to see if all hosts have failed 30564 1726882898.90256: getting the remaining hosts for this loop 30564 1726882898.90258: done getting the remaining hosts for this loop 30564 
1726882898.90261: getting the next task for host managed_node2 30564 1726882898.90273: done getting next task for host managed_node2 30564 1726882898.90275: ^ task is: TASK: meta (role_complete) 30564 1726882898.90281: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882898.90293: getting variables 30564 1726882898.90294: in VariableManager get_vars() 30564 1726882898.90337: Calling all_inventory to load vars for managed_node2 30564 1726882898.90339: Calling groups_inventory to load vars for managed_node2 30564 1726882898.90342: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882898.90351: Calling all_plugins_play to load vars for managed_node2 30564 1726882898.90354: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882898.90356: Calling groups_plugins_play to load vars for managed_node2 30564 1726882898.92821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.08902: done with get_vars() 30564 1726882899.08932: done getting variables 30564 1726882899.09127: done queuing things up, now waiting for results queue to drain 30564 1726882899.09130: results queue empty 30564 1726882899.09131: checking for any_errors_fatal 30564 1726882899.09134: done checking for any_errors_fatal 30564 1726882899.09134: checking for max_fail_percentage 30564 1726882899.09135: done checking for max_fail_percentage 30564 1726882899.09136: checking to see if all hosts have failed and the running result is not ok 30564 1726882899.09137: done checking to see if all hosts have failed 30564 1726882899.09138: getting the remaining hosts for this loop 30564 1726882899.09139: done getting the remaining hosts for this loop 30564 1726882899.09142: getting the next task for host managed_node2 30564 1726882899.09147: done getting next task for host managed_node2 30564 1726882899.09149: ^ task is: TASK: Asserts 30564 1726882899.09152: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882899.09159: getting variables 30564 1726882899.09160: in VariableManager get_vars() 30564 1726882899.09287: Calling all_inventory to load vars for managed_node2 30564 1726882899.09290: Calling groups_inventory to load vars for managed_node2 30564 1726882899.09297: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.09303: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.09305: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882899.09308: Calling groups_plugins_play to load vars for managed_node2 30564 1726882899.12029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.16267: done with get_vars() 30564 1726882899.16294: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:41:39 -0400 (0:00:00.698) 0:01:37.746 ****** 30564 1726882899.16491: entering _queue_task() for managed_node2/include_tasks 30564 1726882899.17298: worker is 1 (out of 1 available) 30564 1726882899.17369: exiting _queue_task() for managed_node2/include_tasks 30564 1726882899.17385: done queuing things up, now waiting for results queue to drain 30564 1726882899.17387: waiting for pending results... 
30564 1726882899.17840: running TaskExecutor() for managed_node2/TASK: Asserts 30564 1726882899.17990: in run() - task 0e448fcc-3ce9-4216-acec-000000001749 30564 1726882899.18012: variable 'ansible_search_path' from source: unknown 30564 1726882899.18020: variable 'ansible_search_path' from source: unknown 30564 1726882899.18082: variable 'lsr_assert' from source: include params 30564 1726882899.18353: variable 'lsr_assert' from source: include params 30564 1726882899.18444: variable 'omit' from source: magic vars 30564 1726882899.18605: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.18631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.18648: variable 'omit' from source: magic vars 30564 1726882899.18922: variable 'ansible_distribution_major_version' from source: facts 30564 1726882899.18951: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882899.18968: variable 'item' from source: unknown 30564 1726882899.19037: variable 'item' from source: unknown 30564 1726882899.19087: variable 'item' from source: unknown 30564 1726882899.19154: variable 'item' from source: unknown 30564 1726882899.19322: dumping result to json 30564 1726882899.19329: done dumping result, returning 30564 1726882899.19337: done running TaskExecutor() for managed_node2/TASK: Asserts [0e448fcc-3ce9-4216-acec-000000001749] 30564 1726882899.19346: sending task result for task 0e448fcc-3ce9-4216-acec-000000001749 30564 1726882899.19440: no more pending results, returning what we have 30564 1726882899.19446: in VariableManager get_vars() 30564 1726882899.19495: Calling all_inventory to load vars for managed_node2 30564 1726882899.19498: Calling groups_inventory to load vars for managed_node2 30564 1726882899.19503: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.19518: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.19522: Calling 
groups_plugins_inventory to load vars for managed_node2 30564 1726882899.19525: Calling groups_plugins_play to load vars for managed_node2 30564 1726882899.20754: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001749 30564 1726882899.20758: WORKER PROCESS EXITING 30564 1726882899.23446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.26868: done with get_vars() 30564 1726882899.26889: variable 'ansible_search_path' from source: unknown 30564 1726882899.26890: variable 'ansible_search_path' from source: unknown 30564 1726882899.27050: we have included files to process 30564 1726882899.27052: generating all_blocks data 30564 1726882899.27053: done generating all_blocks data 30564 1726882899.27058: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882899.27059: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882899.27061: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882899.27297: in VariableManager get_vars() 30564 1726882899.27323: done with get_vars() 30564 1726882899.27515: done processing included file 30564 1726882899.27518: iterating over new_blocks loaded from include file 30564 1726882899.27519: in VariableManager get_vars() 30564 1726882899.27538: done with get_vars() 30564 1726882899.27539: filtering new block on tags 30564 1726882899.27589: done filtering new block on tags 30564 1726882899.27591: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=tasks/assert_profile_absent.yml) 30564 
1726882899.27596: extending task lists for all hosts with included blocks 30564 1726882899.29502: done extending task lists 30564 1726882899.29504: done processing included files 30564 1726882899.29505: results queue empty 30564 1726882899.29506: checking for any_errors_fatal 30564 1726882899.29508: done checking for any_errors_fatal 30564 1726882899.29509: checking for max_fail_percentage 30564 1726882899.29510: done checking for max_fail_percentage 30564 1726882899.29511: checking to see if all hosts have failed and the running result is not ok 30564 1726882899.29512: done checking to see if all hosts have failed 30564 1726882899.29513: getting the remaining hosts for this loop 30564 1726882899.29514: done getting the remaining hosts for this loop 30564 1726882899.29548: getting the next task for host managed_node2 30564 1726882899.29601: done getting next task for host managed_node2 30564 1726882899.29604: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30564 1726882899.29607: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882899.29610: getting variables 30564 1726882899.29611: in VariableManager get_vars() 30564 1726882899.29626: Calling all_inventory to load vars for managed_node2 30564 1726882899.29658: Calling groups_inventory to load vars for managed_node2 30564 1726882899.29661: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.29667: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.29669: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882899.29672: Calling groups_plugins_play to load vars for managed_node2 30564 1726882899.30665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.33491: done with get_vars() 30564 1726882899.33558: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:41:39 -0400 (0:00:00.171) 0:01:37.917 ****** 30564 1726882899.33638: entering _queue_task() for managed_node2/include_tasks 30564 1726882899.33991: worker is 1 (out of 1 available) 30564 1726882899.34005: exiting _queue_task() for managed_node2/include_tasks 30564 1726882899.34015: done queuing things up, now waiting for results queue to drain 30564 1726882899.34016: waiting for pending results... 
30564 1726882899.34215: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30564 1726882899.34319: in run() - task 0e448fcc-3ce9-4216-acec-000000001e99 30564 1726882899.34338: variable 'ansible_search_path' from source: unknown 30564 1726882899.34341: variable 'ansible_search_path' from source: unknown 30564 1726882899.34385: calling self._execute() 30564 1726882899.34744: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.34748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.34751: variable 'omit' from source: magic vars 30564 1726882899.35018: variable 'ansible_distribution_major_version' from source: facts 30564 1726882899.35022: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882899.35024: _execute() done 30564 1726882899.35026: dumping result to json 30564 1726882899.35028: done dumping result, returning 30564 1726882899.35030: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4216-acec-000000001e99] 30564 1726882899.35031: sending task result for task 0e448fcc-3ce9-4216-acec-000000001e99 30564 1726882899.35101: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001e99 30564 1726882899.35104: WORKER PROCESS EXITING 30564 1726882899.35251: no more pending results, returning what we have 30564 1726882899.35255: in VariableManager get_vars() 30564 1726882899.35297: Calling all_inventory to load vars for managed_node2 30564 1726882899.35299: Calling groups_inventory to load vars for managed_node2 30564 1726882899.35302: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.35311: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.35318: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882899.35323: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882899.36720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.38513: done with get_vars() 30564 1726882899.38542: variable 'ansible_search_path' from source: unknown 30564 1726882899.38544: variable 'ansible_search_path' from source: unknown 30564 1726882899.38551: variable 'item' from source: include params 30564 1726882899.38670: variable 'item' from source: include params 30564 1726882899.38704: we have included files to process 30564 1726882899.38706: generating all_blocks data 30564 1726882899.38707: done generating all_blocks data 30564 1726882899.38708: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882899.38709: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882899.38711: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882899.39462: done processing included file 30564 1726882899.39466: iterating over new_blocks loaded from include file 30564 1726882899.39467: in VariableManager get_vars() 30564 1726882899.39479: done with get_vars() 30564 1726882899.39481: filtering new block on tags 30564 1726882899.39525: done filtering new block on tags 30564 1726882899.39527: in VariableManager get_vars() 30564 1726882899.39538: done with get_vars() 30564 1726882899.39539: filtering new block on tags 30564 1726882899.39574: done filtering new block on tags 30564 1726882899.39576: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30564 1726882899.39579: extending task lists for all hosts with included blocks 30564 1726882899.39796: done 
extending task lists 30564 1726882899.39798: done processing included files 30564 1726882899.39798: results queue empty 30564 1726882899.39799: checking for any_errors_fatal 30564 1726882899.39803: done checking for any_errors_fatal 30564 1726882899.39804: checking for max_fail_percentage 30564 1726882899.39805: done checking for max_fail_percentage 30564 1726882899.39805: checking to see if all hosts have failed and the running result is not ok 30564 1726882899.39806: done checking to see if all hosts have failed 30564 1726882899.39807: getting the remaining hosts for this loop 30564 1726882899.39808: done getting the remaining hosts for this loop 30564 1726882899.39811: getting the next task for host managed_node2 30564 1726882899.39815: done getting next task for host managed_node2 30564 1726882899.39818: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882899.39821: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882899.39823: getting variables 30564 1726882899.39824: in VariableManager get_vars() 30564 1726882899.39850: Calling all_inventory to load vars for managed_node2 30564 1726882899.39853: Calling groups_inventory to load vars for managed_node2 30564 1726882899.39855: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.39860: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.39863: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882899.39871: Calling groups_plugins_play to load vars for managed_node2 30564 1726882899.41026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.42415: done with get_vars() 30564 1726882899.42433: done getting variables 30564 1726882899.42468: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:41:39 -0400 (0:00:00.088) 0:01:38.006 ****** 30564 1726882899.42498: entering _queue_task() for managed_node2/set_fact 30564 1726882899.42750: worker is 1 (out of 1 available) 30564 1726882899.42762: exiting _queue_task() for managed_node2/set_fact 30564 1726882899.42778: done queuing things up, now waiting for results queue to drain 30564 1726882899.42779: waiting for pending results... 
30564 1726882899.42979: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882899.43051: in run() - task 0e448fcc-3ce9-4216-acec-000000001f17 30564 1726882899.43062: variable 'ansible_search_path' from source: unknown 30564 1726882899.43070: variable 'ansible_search_path' from source: unknown 30564 1726882899.43098: calling self._execute() 30564 1726882899.43177: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.43181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.43191: variable 'omit' from source: magic vars 30564 1726882899.43481: variable 'ansible_distribution_major_version' from source: facts 30564 1726882899.43492: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882899.43497: variable 'omit' from source: magic vars 30564 1726882899.43532: variable 'omit' from source: magic vars 30564 1726882899.43557: variable 'omit' from source: magic vars 30564 1726882899.43591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882899.43618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882899.43635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882899.43653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882899.43661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882899.43689: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882899.43692: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.43695: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882899.43763: Set connection var ansible_timeout to 10 30564 1726882899.43771: Set connection var ansible_pipelining to False 30564 1726882899.43774: Set connection var ansible_shell_type to sh 30564 1726882899.43777: Set connection var ansible_shell_executable to /bin/sh 30564 1726882899.43784: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882899.43787: Set connection var ansible_connection to ssh 30564 1726882899.43805: variable 'ansible_shell_executable' from source: unknown 30564 1726882899.43808: variable 'ansible_connection' from source: unknown 30564 1726882899.43811: variable 'ansible_module_compression' from source: unknown 30564 1726882899.43813: variable 'ansible_shell_type' from source: unknown 30564 1726882899.43816: variable 'ansible_shell_executable' from source: unknown 30564 1726882899.43818: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.43820: variable 'ansible_pipelining' from source: unknown 30564 1726882899.43823: variable 'ansible_timeout' from source: unknown 30564 1726882899.43825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.43926: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882899.43934: variable 'omit' from source: magic vars 30564 1726882899.43939: starting attempt loop 30564 1726882899.43942: running the handler 30564 1726882899.43953: handler run complete 30564 1726882899.43961: attempt loop complete, returning result 30564 1726882899.43965: _execute() done 30564 1726882899.43970: dumping result to json 30564 1726882899.43974: done dumping result, returning 30564 1726882899.43977: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4216-acec-000000001f17] 30564 1726882899.43987: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f17 30564 1726882899.44070: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f17 30564 1726882899.44074: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30564 1726882899.44151: no more pending results, returning what we have 30564 1726882899.44154: results queue empty 30564 1726882899.44155: checking for any_errors_fatal 30564 1726882899.44157: done checking for any_errors_fatal 30564 1726882899.44157: checking for max_fail_percentage 30564 1726882899.44159: done checking for max_fail_percentage 30564 1726882899.44159: checking to see if all hosts have failed and the running result is not ok 30564 1726882899.44160: done checking to see if all hosts have failed 30564 1726882899.44161: getting the remaining hosts for this loop 30564 1726882899.44163: done getting the remaining hosts for this loop 30564 1726882899.44170: getting the next task for host managed_node2 30564 1726882899.44177: done getting next task for host managed_node2 30564 1726882899.44179: ^ task is: TASK: Stat profile file 30564 1726882899.44184: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882899.44188: getting variables 30564 1726882899.44189: in VariableManager get_vars() 30564 1726882899.44228: Calling all_inventory to load vars for managed_node2 30564 1726882899.44231: Calling groups_inventory to load vars for managed_node2 30564 1726882899.44234: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.44243: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.44246: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882899.44248: Calling groups_plugins_play to load vars for managed_node2 30564 1726882899.45059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.46017: done with get_vars() 30564 1726882899.46033: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:41:39 -0400 (0:00:00.035) 0:01:38.042 ****** 30564 1726882899.46097: entering _queue_task() for managed_node2/stat 30564 1726882899.46284: worker is 1 (out of 1 available) 30564 1726882899.46297: exiting _queue_task() for managed_node2/stat 30564 1726882899.46309: done queuing things up, now waiting for results queue to drain 30564 1726882899.46310: 
waiting for pending results... 30564 1726882899.46503: running TaskExecutor() for managed_node2/TASK: Stat profile file 30564 1726882899.46583: in run() - task 0e448fcc-3ce9-4216-acec-000000001f18 30564 1726882899.46600: variable 'ansible_search_path' from source: unknown 30564 1726882899.46604: variable 'ansible_search_path' from source: unknown 30564 1726882899.46627: calling self._execute() 30564 1726882899.46708: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.46712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.46721: variable 'omit' from source: magic vars 30564 1726882899.47006: variable 'ansible_distribution_major_version' from source: facts 30564 1726882899.47019: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882899.47023: variable 'omit' from source: magic vars 30564 1726882899.47061: variable 'omit' from source: magic vars 30564 1726882899.47130: variable 'profile' from source: play vars 30564 1726882899.47136: variable 'interface' from source: play vars 30564 1726882899.47187: variable 'interface' from source: play vars 30564 1726882899.47200: variable 'omit' from source: magic vars 30564 1726882899.47234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882899.47261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882899.47281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882899.47294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882899.47305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882899.47327: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30564 1726882899.47330: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.47333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.47407: Set connection var ansible_timeout to 10 30564 1726882899.47410: Set connection var ansible_pipelining to False 30564 1726882899.47413: Set connection var ansible_shell_type to sh 30564 1726882899.47419: Set connection var ansible_shell_executable to /bin/sh 30564 1726882899.47425: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882899.47428: Set connection var ansible_connection to ssh 30564 1726882899.47446: variable 'ansible_shell_executable' from source: unknown 30564 1726882899.47449: variable 'ansible_connection' from source: unknown 30564 1726882899.47451: variable 'ansible_module_compression' from source: unknown 30564 1726882899.47454: variable 'ansible_shell_type' from source: unknown 30564 1726882899.47456: variable 'ansible_shell_executable' from source: unknown 30564 1726882899.47458: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.47464: variable 'ansible_pipelining' from source: unknown 30564 1726882899.47466: variable 'ansible_timeout' from source: unknown 30564 1726882899.47475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.47617: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882899.47625: variable 'omit' from source: magic vars 30564 1726882899.47631: starting attempt loop 30564 1726882899.47634: running the handler 30564 1726882899.47644: _low_level_execute_command(): starting 30564 1726882899.47651: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 
1726882899.48168: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.48172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.48204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882899.48208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.48211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.48257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882899.48273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.48396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.50053: stdout chunk (state=3): >>>/root <<< 30564 1726882899.50156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882899.50209: stderr chunk (state=3): >>><<< 30564 1726882899.50213: stdout chunk (state=3): >>><<< 30564 1726882899.50231: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882899.50241: _low_level_execute_command(): starting 30564 1726882899.50245: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301 `" && echo ansible-tmp-1726882899.50229-34865-79273458860301="` echo /root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301 `" ) && sleep 0' 30564 1726882899.50670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.50674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.50724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.50733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.50736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882899.50738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.50782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882899.50795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.50901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.52769: stdout chunk (state=3): >>>ansible-tmp-1726882899.50229-34865-79273458860301=/root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301 <<< 30564 1726882899.52882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882899.52923: stderr chunk (state=3): >>><<< 30564 1726882899.52926: stdout chunk (state=3): >>><<< 30564 1726882899.52939: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882899.50229-34865-79273458860301=/root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882899.52977: variable 'ansible_module_compression' from source: unknown 30564 1726882899.53027: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882899.53057: variable 'ansible_facts' from source: unknown 30564 1726882899.53123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301/AnsiballZ_stat.py 30564 1726882899.53222: Sending initial data 30564 1726882899.53225: Sent initial data (150 bytes) 30564 1726882899.53870: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.53875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.53943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.53948: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.53951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.53989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882899.53994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.54095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.55822: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882899.55915: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882899.56013: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpgdiuu5h1 /root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301/AnsiballZ_stat.py <<< 30564 1726882899.56113: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or 
directory <<< 30564 1726882899.57122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882899.57227: stderr chunk (state=3): >>><<< 30564 1726882899.57230: stdout chunk (state=3): >>><<< 30564 1726882899.57249: done transferring module to remote 30564 1726882899.57257: _low_level_execute_command(): starting 30564 1726882899.57261: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301/ /root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301/AnsiballZ_stat.py && sleep 0' 30564 1726882899.58167: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.58177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.58217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.58223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882899.58235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.58241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882899.58247: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882899.58259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.58336: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 30564 1726882899.58339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882899.58359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.58482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.60301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882899.60304: stdout chunk (state=3): >>><<< 30564 1726882899.60312: stderr chunk (state=3): >>><<< 30564 1726882899.60327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882899.60330: _low_level_execute_command(): starting 30564 1726882899.60335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301/AnsiballZ_stat.py && sleep 0' 30564 1726882899.60912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882899.60921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.60931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.60970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.61005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882899.61012: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882899.61022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.61036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882899.61044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882899.61051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882899.61059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.61081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.61087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.61095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882899.61103: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882899.61113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.61200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 
1726882899.61219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882899.61228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.61379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.74281: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882899.75346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882899.75355: stderr chunk (state=3): >>><<< 30564 1726882899.75358: stdout chunk (state=3): >>><<< 30564 1726882899.75575: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882899.75581: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882899.75584: _low_level_execute_command(): starting 30564 1726882899.75586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882899.50229-34865-79273458860301/ > /dev/null 2>&1 && sleep 0' 30564 1726882899.76174: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882899.76178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.76180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.76183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.76185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 30564 1726882899.76188: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882899.76190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.76192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882899.76198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882899.76228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882899.76231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.76234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.76236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.76238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882899.76240: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882899.76250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.76329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882899.76343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882899.76346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.76479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.78351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882899.78354: stdout chunk (state=3): >>><<< 30564 1726882899.78362: stderr chunk (state=3): >>><<< 30564 1726882899.78390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882899.78396: handler run complete 30564 1726882899.78420: attempt loop complete, returning result 30564 1726882899.78423: _execute() done 30564 1726882899.78426: dumping result to json 30564 1726882899.78428: done dumping result, returning 30564 1726882899.78439: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4216-acec-000000001f18] 30564 1726882899.78444: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f18 30564 1726882899.78553: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f18 30564 1726882899.78556: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882899.78634: no more pending results, returning what we have 30564 1726882899.78638: results queue empty 30564 1726882899.78640: checking for any_errors_fatal 30564 1726882899.78648: done checking for 
any_errors_fatal 30564 1726882899.78649: checking for max_fail_percentage 30564 1726882899.78651: done checking for max_fail_percentage 30564 1726882899.78652: checking to see if all hosts have failed and the running result is not ok 30564 1726882899.78653: done checking to see if all hosts have failed 30564 1726882899.78654: getting the remaining hosts for this loop 30564 1726882899.78656: done getting the remaining hosts for this loop 30564 1726882899.78660: getting the next task for host managed_node2 30564 1726882899.78677: done getting next task for host managed_node2 30564 1726882899.78682: ^ task is: TASK: Set NM profile exist flag based on the profile files 30564 1726882899.78688: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882899.78693: getting variables 30564 1726882899.78695: in VariableManager get_vars() 30564 1726882899.78739: Calling all_inventory to load vars for managed_node2 30564 1726882899.78742: Calling groups_inventory to load vars for managed_node2 30564 1726882899.78746: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.78758: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.78761: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882899.78767: Calling groups_plugins_play to load vars for managed_node2 30564 1726882899.80906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.82970: done with get_vars() 30564 1726882899.83001: done getting variables 30564 1726882899.83072: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:41:39 -0400 (0:00:00.370) 0:01:38.412 ****** 30564 1726882899.83108: entering _queue_task() for managed_node2/set_fact 30564 1726882899.83462: worker is 1 (out of 1 available) 30564 1726882899.83480: exiting _queue_task() for managed_node2/set_fact 30564 1726882899.83493: done queuing things up, now waiting for results queue to drain 30564 1726882899.83494: waiting for pending results... 
30564 1726882899.83816: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30564 1726882899.83941: in run() - task 0e448fcc-3ce9-4216-acec-000000001f19 30564 1726882899.83960: variable 'ansible_search_path' from source: unknown 30564 1726882899.83972: variable 'ansible_search_path' from source: unknown 30564 1726882899.84005: calling self._execute() 30564 1726882899.84116: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.84120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.84132: variable 'omit' from source: magic vars 30564 1726882899.84555: variable 'ansible_distribution_major_version' from source: facts 30564 1726882899.84570: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882899.84719: variable 'profile_stat' from source: set_fact 30564 1726882899.84728: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882899.84731: when evaluation is False, skipping this task 30564 1726882899.84734: _execute() done 30564 1726882899.84737: dumping result to json 30564 1726882899.84739: done dumping result, returning 30564 1726882899.84745: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4216-acec-000000001f19] 30564 1726882899.84755: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f19 30564 1726882899.84853: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f19 30564 1726882899.84857: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882899.84905: no more pending results, returning what we have 30564 1726882899.84909: results queue empty 30564 1726882899.84910: checking for any_errors_fatal 30564 1726882899.84922: done checking for any_errors_fatal 30564 1726882899.84923: 
checking for max_fail_percentage 30564 1726882899.84924: done checking for max_fail_percentage 30564 1726882899.84925: checking to see if all hosts have failed and the running result is not ok 30564 1726882899.84926: done checking to see if all hosts have failed 30564 1726882899.84927: getting the remaining hosts for this loop 30564 1726882899.84929: done getting the remaining hosts for this loop 30564 1726882899.84933: getting the next task for host managed_node2 30564 1726882899.84942: done getting next task for host managed_node2 30564 1726882899.84944: ^ task is: TASK: Get NM profile info 30564 1726882899.84950: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882899.84954: getting variables 30564 1726882899.84956: in VariableManager get_vars() 30564 1726882899.85002: Calling all_inventory to load vars for managed_node2 30564 1726882899.85006: Calling groups_inventory to load vars for managed_node2 30564 1726882899.85010: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882899.85023: Calling all_plugins_play to load vars for managed_node2 30564 1726882899.85027: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882899.85030: Calling groups_plugins_play to load vars for managed_node2 30564 1726882899.86624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882899.87719: done with get_vars() 30564 1726882899.87734: done getting variables 30564 1726882899.87780: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:41:39 -0400 (0:00:00.046) 0:01:38.459 ****** 30564 1726882899.87803: entering _queue_task() for managed_node2/shell 30564 1726882899.88023: worker is 1 (out of 1 available) 30564 1726882899.88044: exiting _queue_task() for managed_node2/shell 30564 1726882899.88055: done queuing things up, now waiting for results queue to drain 30564 1726882899.88056: waiting for pending results... 
30564 1726882899.88280: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30564 1726882899.88376: in run() - task 0e448fcc-3ce9-4216-acec-000000001f1a 30564 1726882899.88388: variable 'ansible_search_path' from source: unknown 30564 1726882899.88393: variable 'ansible_search_path' from source: unknown 30564 1726882899.88421: calling self._execute() 30564 1726882899.88498: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.88503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.88512: variable 'omit' from source: magic vars 30564 1726882899.88790: variable 'ansible_distribution_major_version' from source: facts 30564 1726882899.88801: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882899.88806: variable 'omit' from source: magic vars 30564 1726882899.88845: variable 'omit' from source: magic vars 30564 1726882899.88914: variable 'profile' from source: play vars 30564 1726882899.88917: variable 'interface' from source: play vars 30564 1726882899.88966: variable 'interface' from source: play vars 30564 1726882899.88981: variable 'omit' from source: magic vars 30564 1726882899.89015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882899.89040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882899.89165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882899.89170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882899.89173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882899.89176: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 
1726882899.89178: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.89180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.89226: Set connection var ansible_timeout to 10 30564 1726882899.89231: Set connection var ansible_pipelining to False 30564 1726882899.89234: Set connection var ansible_shell_type to sh 30564 1726882899.89240: Set connection var ansible_shell_executable to /bin/sh 30564 1726882899.89247: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882899.89250: Set connection var ansible_connection to ssh 30564 1726882899.89278: variable 'ansible_shell_executable' from source: unknown 30564 1726882899.89281: variable 'ansible_connection' from source: unknown 30564 1726882899.89283: variable 'ansible_module_compression' from source: unknown 30564 1726882899.89286: variable 'ansible_shell_type' from source: unknown 30564 1726882899.89288: variable 'ansible_shell_executable' from source: unknown 30564 1726882899.89290: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882899.89294: variable 'ansible_pipelining' from source: unknown 30564 1726882899.89297: variable 'ansible_timeout' from source: unknown 30564 1726882899.89300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882899.89425: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882899.89435: variable 'omit' from source: magic vars 30564 1726882899.89440: starting attempt loop 30564 1726882899.89443: running the handler 30564 1726882899.89454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882899.89474: _low_level_execute_command(): starting 30564 1726882899.89482: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882899.90130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.90134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.90166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.90171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.90226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882899.90230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882899.90234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.90339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.92006: stdout chunk (state=3): >>>/root <<< 30564 1726882899.92108: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 30564 1726882899.92156: stderr chunk (state=3): >>><<< 30564 1726882899.92158: stdout chunk (state=3): >>><<< 30564 1726882899.92245: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882899.92250: _low_level_execute_command(): starting 30564 1726882899.92253: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523 `" && echo ansible-tmp-1726882899.9217858-34890-205063534990523="` echo /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523 `" ) && sleep 0' 30564 1726882899.92811: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.92821: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.92850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.92877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.92880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.92882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.92936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882899.92939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882899.92944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.93049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882899.94950: stdout chunk (state=3): >>>ansible-tmp-1726882899.9217858-34890-205063534990523=/root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523 <<< 30564 1726882899.95059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882899.95124: stderr chunk (state=3): >>><<< 30564 1726882899.95127: stdout chunk (state=3): >>><<< 30564 1726882899.95416: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882899.9217858-34890-205063534990523=/root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882899.95420: variable 'ansible_module_compression' from source: unknown 30564 1726882899.95422: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882899.95424: variable 'ansible_facts' from source: unknown 30564 1726882899.95425: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523/AnsiballZ_command.py 30564 1726882899.95485: Sending initial data 30564 1726882899.95488: Sent initial data (156 bytes) 30564 1726882899.96375: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882899.96388: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 30564 1726882899.96400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.96415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.96452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882899.96462: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882899.96481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.96497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882899.96506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882899.96515: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882899.96524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882899.96534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882899.96546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882899.96555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882899.96567: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882899.96579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882899.96648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882899.96665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882899.96680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882899.96987: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882899.98719: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882899.98811: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882899.98913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpjg_l5kci /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523/AnsiballZ_command.py <<< 30564 1726882899.99048: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882900.00077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882900.00167: stderr chunk (state=3): >>><<< 30564 1726882900.00176: stdout chunk (state=3): >>><<< 30564 1726882900.00192: done transferring module to remote 30564 1726882900.00201: _low_level_execute_command(): starting 30564 1726882900.00205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523/ /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523/AnsiballZ_command.py && sleep 0' 30564 1726882900.00785: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882900.00793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882900.00803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882900.00817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882900.00856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882900.00866: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882900.00881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882900.00899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882900.00905: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882900.00912: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882900.00920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882900.00929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882900.00945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882900.00952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882900.00958: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882900.00969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882900.02480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882900.02494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882900.02504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882900.02647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882900.04468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882900.04476: stdout chunk (state=3): >>><<< 30564 1726882900.04484: stderr chunk (state=3): >>><<< 30564 1726882900.04500: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882900.04503: _low_level_execute_command(): starting 30564 1726882900.04509: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523/AnsiballZ_command.py && sleep 0' 30564 1726882900.05959: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882900.05967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882900.06030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882900.06036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882900.06116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882900.06122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882900.06127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882900.06249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882900.06252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882900.06266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882900.06400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882900.21352: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:41:40.193106", "end": "2024-09-20 21:41:40.211424", "delta": "0:00:00.018318", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882900.22550: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. <<< 30564 1726882900.22554: stdout chunk (state=3): >>><<< 30564 1726882900.22556: stderr chunk (state=3): >>><<< 30564 1726882900.22709: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:41:40.193106", "end": "2024-09-20 21:41:40.211424", "delta": "0:00:00.018318", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 30564 1726882900.22713: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882900.22720: _low_level_execute_command(): starting 30564 1726882900.22723: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882899.9217858-34890-205063534990523/ > /dev/null 2>&1 && sleep 0' 30564 1726882900.24123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882900.24182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882900.24198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882900.24215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882900.24261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882900.24379: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882900.24394: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882900.24412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882900.24423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882900.24433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882900.24450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882900.24467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882900.24483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882900.24495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882900.24506: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882900.24521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882900.24599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882900.24788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882900.24804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882900.24930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882900.26804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882900.26808: stdout chunk (state=3): >>><<< 30564 1726882900.26810: stderr chunk (state=3): >>><<< 30564 1726882900.26869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882900.26873: handler run complete 30564 1726882900.26875: Evaluated conditional (False): False 30564 1726882900.26877: attempt loop complete, returning result 30564 1726882900.26879: _execute() done 30564 1726882900.27071: dumping result to json 30564 1726882900.27075: done dumping result, returning 30564 1726882900.27078: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4216-acec-000000001f1a] 30564 1726882900.27080: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1a 30564 1726882900.27149: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1a 30564 1726882900.27153: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018318", "end": "2024-09-20 21:41:40.211424", "rc": 1, "start": "2024-09-20 21:41:40.193106" } MSG: non-zero return code ...ignoring 30564 1726882900.27228: no more pending results, returning what we have 30564 1726882900.27231: results queue empty 30564 1726882900.27232: checking for any_errors_fatal 30564 1726882900.27237: done checking for any_errors_fatal 30564 1726882900.27238: checking for max_fail_percentage 30564 1726882900.27239: done checking for max_fail_percentage 30564 1726882900.27240: checking to see if all hosts have failed and the running result is not ok 30564 1726882900.27241: done checking to see if all hosts have failed 30564 1726882900.27241: getting the remaining hosts for this loop 30564 1726882900.27243: done getting the remaining hosts for this loop 30564 1726882900.27246: getting the next task for host managed_node2 30564 1726882900.27254: done getting next task for host managed_node2 30564 1726882900.27256: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882900.27261: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882900.27266: getting variables 30564 1726882900.27269: in VariableManager get_vars() 30564 1726882900.27304: Calling all_inventory to load vars for managed_node2 30564 1726882900.27307: Calling groups_inventory to load vars for managed_node2 30564 1726882900.27309: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.27319: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.27321: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.27323: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.30961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.36567: done with get_vars() 30564 1726882900.36607: done getting variables 30564 1726882900.36671: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:41:40 -0400 (0:00:00.489) 0:01:38.948 ****** 30564 1726882900.36713: entering _queue_task() for managed_node2/set_fact 30564 1726882900.37760: worker is 1 (out of 1 available) 30564 1726882900.37803: exiting _queue_task() for managed_node2/set_fact 30564 
1726882900.37816: done queuing things up, now waiting for results queue to drain 30564 1726882900.37818: waiting for pending results... 30564 1726882900.38650: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882900.39178: in run() - task 0e448fcc-3ce9-4216-acec-000000001f1b 30564 1726882900.39197: variable 'ansible_search_path' from source: unknown 30564 1726882900.39204: variable 'ansible_search_path' from source: unknown 30564 1726882900.39241: calling self._execute() 30564 1726882900.39347: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.39360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.39378: variable 'omit' from source: magic vars 30564 1726882900.40068: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.40287: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.40421: variable 'nm_profile_exists' from source: set_fact 30564 1726882900.40488: Evaluated conditional (nm_profile_exists.rc == 0): False 30564 1726882900.40773: when evaluation is False, skipping this task 30564 1726882900.40782: _execute() done 30564 1726882900.40791: dumping result to json 30564 1726882900.40798: done dumping result, returning 30564 1726882900.40811: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4216-acec-000000001f1b] 30564 1726882900.40821: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1b 30564 1726882900.40943: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1b 30564 1726882900.40952: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30564 1726882900.41307: no more pending results, 
returning what we have 30564 1726882900.41310: results queue empty 30564 1726882900.41311: checking for any_errors_fatal 30564 1726882900.41317: done checking for any_errors_fatal 30564 1726882900.41318: checking for max_fail_percentage 30564 1726882900.41320: done checking for max_fail_percentage 30564 1726882900.41321: checking to see if all hosts have failed and the running result is not ok 30564 1726882900.41322: done checking to see if all hosts have failed 30564 1726882900.41323: getting the remaining hosts for this loop 30564 1726882900.41324: done getting the remaining hosts for this loop 30564 1726882900.41328: getting the next task for host managed_node2 30564 1726882900.41336: done getting next task for host managed_node2 30564 1726882900.41338: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882900.41343: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882900.41346: getting variables 30564 1726882900.41347: in VariableManager get_vars() 30564 1726882900.41382: Calling all_inventory to load vars for managed_node2 30564 1726882900.41385: Calling groups_inventory to load vars for managed_node2 30564 1726882900.41388: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.41396: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.41399: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.41401: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.44580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.48426: done with get_vars() 30564 1726882900.48454: done getting variables 30564 1726882900.48519: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882900.48644: variable 'profile' from source: play vars 30564 1726882900.48648: variable 'interface' from source: play vars 30564 1726882900.48714: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:41:40 -0400 (0:00:00.120) 0:01:39.068 ****** 30564 1726882900.48746: entering _queue_task() for managed_node2/command 30564 1726882900.49629: worker is 1 (out of 1 available) 30564 1726882900.49642: exiting _queue_task() for managed_node2/command 30564 1726882900.49655: done queuing things up, now waiting for results queue to drain 30564 1726882900.49656: waiting for pending results... 
30564 1726882900.50072: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30564 1726882900.50182: in run() - task 0e448fcc-3ce9-4216-acec-000000001f1d 30564 1726882900.50202: variable 'ansible_search_path' from source: unknown 30564 1726882900.50205: variable 'ansible_search_path' from source: unknown 30564 1726882900.50241: calling self._execute() 30564 1726882900.50395: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.50399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.50410: variable 'omit' from source: magic vars 30564 1726882900.50959: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.50977: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.51126: variable 'profile_stat' from source: set_fact 30564 1726882900.51139: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882900.51143: when evaluation is False, skipping this task 30564 1726882900.51146: _execute() done 30564 1726882900.51148: dumping result to json 30564 1726882900.51150: done dumping result, returning 30564 1726882900.51156: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000001f1d] 30564 1726882900.51162: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1d 30564 1726882900.51262: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1d 30564 1726882900.51268: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882900.51326: no more pending results, returning what we have 30564 1726882900.51331: results queue empty 30564 1726882900.51333: checking for any_errors_fatal 30564 1726882900.51344: done checking for any_errors_fatal 30564 1726882900.51345: 
checking for max_fail_percentage 30564 1726882900.51347: done checking for max_fail_percentage 30564 1726882900.51348: checking to see if all hosts have failed and the running result is not ok 30564 1726882900.51349: done checking to see if all hosts have failed 30564 1726882900.51350: getting the remaining hosts for this loop 30564 1726882900.51353: done getting the remaining hosts for this loop 30564 1726882900.51357: getting the next task for host managed_node2 30564 1726882900.51372: done getting next task for host managed_node2 30564 1726882900.51375: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882900.51382: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882900.51391: getting variables 30564 1726882900.51393: in VariableManager get_vars() 30564 1726882900.51449: Calling all_inventory to load vars for managed_node2 30564 1726882900.51453: Calling groups_inventory to load vars for managed_node2 30564 1726882900.51457: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.51495: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.51500: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.51504: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.53492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.55939: done with get_vars() 30564 1726882900.55970: done getting variables 30564 1726882900.56026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882900.56144: variable 'profile' from source: play vars 30564 1726882900.56148: variable 'interface' from source: play vars 30564 1726882900.56212: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:41:40 -0400 (0:00:00.075) 0:01:39.144 ****** 30564 1726882900.56324: entering _queue_task() for managed_node2/set_fact 30564 1726882900.56676: worker is 1 (out of 1 available) 30564 1726882900.56688: exiting _queue_task() for managed_node2/set_fact 30564 1726882900.56700: done queuing things up, now waiting for results queue to drain 30564 1726882900.56701: waiting for pending results... 
30564 1726882900.57078: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 30564 1726882900.57207: in run() - task 0e448fcc-3ce9-4216-acec-000000001f1e 30564 1726882900.57227: variable 'ansible_search_path' from source: unknown 30564 1726882900.57233: variable 'ansible_search_path' from source: unknown 30564 1726882900.57281: calling self._execute() 30564 1726882900.57385: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.57396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.57410: variable 'omit' from source: magic vars 30564 1726882900.57793: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.57817: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.57949: variable 'profile_stat' from source: set_fact 30564 1726882900.57966: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882900.57974: when evaluation is False, skipping this task 30564 1726882900.57980: _execute() done 30564 1726882900.57987: dumping result to json 30564 1726882900.57993: done dumping result, returning 30564 1726882900.58002: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000001f1e] 30564 1726882900.58015: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1e skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882900.58166: no more pending results, returning what we have 30564 1726882900.58171: results queue empty 30564 1726882900.58172: checking for any_errors_fatal 30564 1726882900.58180: done checking for any_errors_fatal 30564 1726882900.58181: checking for max_fail_percentage 30564 1726882900.58183: done checking for max_fail_percentage 30564 1726882900.58184: checking to see if all 
hosts have failed and the running result is not ok 30564 1726882900.58185: done checking to see if all hosts have failed 30564 1726882900.58186: getting the remaining hosts for this loop 30564 1726882900.58188: done getting the remaining hosts for this loop 30564 1726882900.58192: getting the next task for host managed_node2 30564 1726882900.58201: done getting next task for host managed_node2 30564 1726882900.58204: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30564 1726882900.58210: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882900.58215: getting variables 30564 1726882900.58216: in VariableManager get_vars() 30564 1726882900.58259: Calling all_inventory to load vars for managed_node2 30564 1726882900.58263: Calling groups_inventory to load vars for managed_node2 30564 1726882900.58268: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.58283: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.58287: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.58290: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.59450: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1e 30564 1726882900.59454: WORKER PROCESS EXITING 30564 1726882900.60321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.62949: done with get_vars() 30564 1726882900.62989: done getting variables 30564 1726882900.63048: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882900.63165: variable 'profile' from source: play vars 30564 1726882900.63169: variable 'interface' from source: play vars 30564 1726882900.63232: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:41:40 -0400 (0:00:00.069) 0:01:39.213 ****** 30564 1726882900.63267: entering _queue_task() for managed_node2/command 30564 1726882900.63757: worker is 1 (out of 1 available) 30564 1726882900.63772: exiting _queue_task() for managed_node2/command 30564 
1726882900.63785: done queuing things up, now waiting for results queue to drain 30564 1726882900.63786: waiting for pending results... 30564 1726882900.64558: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 30564 1726882900.64666: in run() - task 0e448fcc-3ce9-4216-acec-000000001f1f 30564 1726882900.64684: variable 'ansible_search_path' from source: unknown 30564 1726882900.64688: variable 'ansible_search_path' from source: unknown 30564 1726882900.64889: calling self._execute() 30564 1726882900.65050: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.65062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.65087: variable 'omit' from source: magic vars 30564 1726882900.65487: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.65505: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.65640: variable 'profile_stat' from source: set_fact 30564 1726882900.65655: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882900.65662: when evaluation is False, skipping this task 30564 1726882900.65672: _execute() done 30564 1726882900.65679: dumping result to json 30564 1726882900.65690: done dumping result, returning 30564 1726882900.65703: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000001f1f] 30564 1726882900.65712: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1f skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882900.65859: no more pending results, returning what we have 30564 1726882900.65866: results queue empty 30564 1726882900.65867: checking for any_errors_fatal 30564 1726882900.65876: done checking for any_errors_fatal 30564 1726882900.65877: checking for 
max_fail_percentage 30564 1726882900.65879: done checking for max_fail_percentage 30564 1726882900.65880: checking to see if all hosts have failed and the running result is not ok 30564 1726882900.65881: done checking to see if all hosts have failed 30564 1726882900.65882: getting the remaining hosts for this loop 30564 1726882900.65884: done getting the remaining hosts for this loop 30564 1726882900.65888: getting the next task for host managed_node2 30564 1726882900.65897: done getting next task for host managed_node2 30564 1726882900.65899: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30564 1726882900.65905: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882900.65910: getting variables 30564 1726882900.65912: in VariableManager get_vars() 30564 1726882900.65956: Calling all_inventory to load vars for managed_node2 30564 1726882900.65959: Calling groups_inventory to load vars for managed_node2 30564 1726882900.65962: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.65977: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.65981: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.65984: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.67004: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f1f 30564 1726882900.67007: WORKER PROCESS EXITING 30564 1726882900.68023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.70374: done with get_vars() 30564 1726882900.70405: done getting variables 30564 1726882900.70462: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882900.70632: variable 'profile' from source: play vars 30564 1726882900.70636: variable 'interface' from source: play vars 30564 1726882900.70705: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:41:40 -0400 (0:00:00.074) 0:01:39.288 ****** 30564 1726882900.70743: entering _queue_task() for managed_node2/set_fact 30564 1726882900.71071: worker is 1 (out of 1 available) 30564 1726882900.71084: exiting _queue_task() for managed_node2/set_fact 30564 
1726882900.71095: done queuing things up, now waiting for results queue to drain 30564 1726882900.71097: waiting for pending results... 30564 1726882900.71792: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30564 1726882900.71923: in run() - task 0e448fcc-3ce9-4216-acec-000000001f20 30564 1726882900.71947: variable 'ansible_search_path' from source: unknown 30564 1726882900.71954: variable 'ansible_search_path' from source: unknown 30564 1726882900.71997: calling self._execute() 30564 1726882900.72101: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.72112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.72125: variable 'omit' from source: magic vars 30564 1726882900.72498: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.72516: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.72647: variable 'profile_stat' from source: set_fact 30564 1726882900.72671: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882900.72684: when evaluation is False, skipping this task 30564 1726882900.72696: _execute() done 30564 1726882900.72704: dumping result to json 30564 1726882900.72714: done dumping result, returning 30564 1726882900.72725: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000001f20] 30564 1726882900.72735: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f20 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882900.72888: no more pending results, returning what we have 30564 1726882900.72892: results queue empty 30564 1726882900.72893: checking for any_errors_fatal 30564 1726882900.72901: done checking for any_errors_fatal 30564 1726882900.72902: checking for 
max_fail_percentage 30564 1726882900.72903: done checking for max_fail_percentage 30564 1726882900.72904: checking to see if all hosts have failed and the running result is not ok 30564 1726882900.72905: done checking to see if all hosts have failed 30564 1726882900.72906: getting the remaining hosts for this loop 30564 1726882900.72908: done getting the remaining hosts for this loop 30564 1726882900.72911: getting the next task for host managed_node2 30564 1726882900.72921: done getting next task for host managed_node2 30564 1726882900.72924: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30564 1726882900.72928: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882900.72934: getting variables 30564 1726882900.72936: in VariableManager get_vars() 30564 1726882900.72978: Calling all_inventory to load vars for managed_node2 30564 1726882900.72980: Calling groups_inventory to load vars for managed_node2 30564 1726882900.72984: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.72999: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.73002: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.73006: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.74174: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f20 30564 1726882900.74178: WORKER PROCESS EXITING 30564 1726882900.75138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.77084: done with get_vars() 30564 1726882900.77109: done getting variables 30564 1726882900.77174: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882900.77320: variable 'profile' from source: play vars 30564 1726882900.77324: variable 'interface' from source: play vars 30564 1726882900.77401: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:41:40 -0400 (0:00:00.066) 0:01:39.355 ****** 30564 1726882900.77449: entering _queue_task() for managed_node2/assert 30564 1726882900.77886: worker is 1 (out of 1 available) 30564 1726882900.77900: exiting _queue_task() for managed_node2/assert 30564 
1726882900.77913: done queuing things up, now waiting for results queue to drain 30564 1726882900.77921: waiting for pending results... 30564 1726882900.78334: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 30564 1726882900.78454: in run() - task 0e448fcc-3ce9-4216-acec-000000001e9a 30564 1726882900.78467: variable 'ansible_search_path' from source: unknown 30564 1726882900.78473: variable 'ansible_search_path' from source: unknown 30564 1726882900.78511: calling self._execute() 30564 1726882900.78618: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.78622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.78633: variable 'omit' from source: magic vars 30564 1726882900.78996: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.79008: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.79015: variable 'omit' from source: magic vars 30564 1726882900.79060: variable 'omit' from source: magic vars 30564 1726882900.79154: variable 'profile' from source: play vars 30564 1726882900.79158: variable 'interface' from source: play vars 30564 1726882900.79220: variable 'interface' from source: play vars 30564 1726882900.79236: variable 'omit' from source: magic vars 30564 1726882900.79281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882900.79313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882900.79333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882900.79349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882900.79361: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882900.79398: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882900.79402: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.79404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.79505: Set connection var ansible_timeout to 10 30564 1726882900.79510: Set connection var ansible_pipelining to False 30564 1726882900.79513: Set connection var ansible_shell_type to sh 30564 1726882900.79519: Set connection var ansible_shell_executable to /bin/sh 30564 1726882900.79526: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882900.79529: Set connection var ansible_connection to ssh 30564 1726882900.79555: variable 'ansible_shell_executable' from source: unknown 30564 1726882900.79558: variable 'ansible_connection' from source: unknown 30564 1726882900.79561: variable 'ansible_module_compression' from source: unknown 30564 1726882900.79564: variable 'ansible_shell_type' from source: unknown 30564 1726882900.79573: variable 'ansible_shell_executable' from source: unknown 30564 1726882900.79576: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.79578: variable 'ansible_pipelining' from source: unknown 30564 1726882900.79580: variable 'ansible_timeout' from source: unknown 30564 1726882900.79582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.79739: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882900.79750: variable 'omit' from source: magic vars 30564 1726882900.79756: starting 
attempt loop 30564 1726882900.79759: running the handler 30564 1726882900.79883: variable 'lsr_net_profile_exists' from source: set_fact 30564 1726882900.79886: Evaluated conditional (not lsr_net_profile_exists): True 30564 1726882900.79894: handler run complete 30564 1726882900.79907: attempt loop complete, returning result 30564 1726882900.79915: _execute() done 30564 1726882900.79918: dumping result to json 30564 1726882900.79921: done dumping result, returning 30564 1726882900.79928: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [0e448fcc-3ce9-4216-acec-000000001e9a] 30564 1726882900.79934: sending task result for task 0e448fcc-3ce9-4216-acec-000000001e9a 30564 1726882900.80030: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001e9a 30564 1726882900.80033: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882900.80084: no more pending results, returning what we have 30564 1726882900.80087: results queue empty 30564 1726882900.80089: checking for any_errors_fatal 30564 1726882900.80097: done checking for any_errors_fatal 30564 1726882900.80098: checking for max_fail_percentage 30564 1726882900.80100: done checking for max_fail_percentage 30564 1726882900.80101: checking to see if all hosts have failed and the running result is not ok 30564 1726882900.80102: done checking to see if all hosts have failed 30564 1726882900.80103: getting the remaining hosts for this loop 30564 1726882900.80105: done getting the remaining hosts for this loop 30564 1726882900.80109: getting the next task for host managed_node2 30564 1726882900.80120: done getting next task for host managed_node2 30564 1726882900.80124: ^ task is: TASK: Conditional asserts 30564 1726882900.80127: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882900.80133: getting variables 30564 1726882900.80134: in VariableManager get_vars() 30564 1726882900.80177: Calling all_inventory to load vars for managed_node2 30564 1726882900.80180: Calling groups_inventory to load vars for managed_node2 30564 1726882900.80184: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.80196: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.80200: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.80202: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.81853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.83622: done with get_vars() 30564 1726882900.83645: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:41:40 -0400 (0:00:00.064) 0:01:39.419 ****** 30564 1726882900.83855: entering _queue_task() for managed_node2/include_tasks 30564 1726882900.84152: worker is 1 (out of 1 available) 30564 1726882900.84166: exiting _queue_task() for managed_node2/include_tasks 30564 1726882900.84179: done queuing things up, now waiting for results queue to drain 30564 1726882900.84180: waiting for pending results... 
30564 1726882900.84488: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30564 1726882900.84579: in run() - task 0e448fcc-3ce9-4216-acec-00000000174a 30564 1726882900.84593: variable 'ansible_search_path' from source: unknown 30564 1726882900.84597: variable 'ansible_search_path' from source: unknown 30564 1726882900.84887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882900.87103: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882900.87174: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882900.87208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882900.87244: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882900.87272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882900.87349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882900.87381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882900.87406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882900.87450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30564 1726882900.87466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882900.87572: variable 'lsr_assert_when' from source: include params 30564 1726882900.87685: variable 'network_provider' from source: set_fact 30564 1726882900.87751: variable 'omit' from source: magic vars 30564 1726882900.88251: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.88259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.88273: variable 'omit' from source: magic vars 30564 1726882900.88484: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.88494: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.88616: variable 'item' from source: unknown 30564 1726882900.88624: Evaluated conditional (item['condition']): True 30564 1726882900.88716: variable 'item' from source: unknown 30564 1726882900.88754: variable 'item' from source: unknown 30564 1726882900.88815: variable 'item' from source: unknown 30564 1726882900.88976: dumping result to json 30564 1726882900.88979: done dumping result, returning 30564 1726882900.88981: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0e448fcc-3ce9-4216-acec-00000000174a] 30564 1726882900.88983: sending task result for task 0e448fcc-3ce9-4216-acec-00000000174a 30564 1726882900.89022: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000174a 30564 1726882900.89025: WORKER PROCESS EXITING 30564 1726882900.89108: no more pending results, returning what we have 30564 1726882900.89113: in VariableManager get_vars() 30564 1726882900.89162: Calling all_inventory to load vars for managed_node2 30564 1726882900.89168: Calling groups_inventory to load vars for managed_node2 30564 1726882900.89172: 
Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.89184: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.89188: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.89191: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.91000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.92708: done with get_vars() 30564 1726882900.92728: variable 'ansible_search_path' from source: unknown 30564 1726882900.92730: variable 'ansible_search_path' from source: unknown 30564 1726882900.92770: we have included files to process 30564 1726882900.92771: generating all_blocks data 30564 1726882900.92773: done generating all_blocks data 30564 1726882900.92778: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882900.92779: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882900.92782: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882900.92890: in VariableManager get_vars() 30564 1726882900.92913: done with get_vars() 30564 1726882900.93024: done processing included file 30564 1726882900.93026: iterating over new_blocks loaded from include file 30564 1726882900.93028: in VariableManager get_vars() 30564 1726882900.93043: done with get_vars() 30564 1726882900.93045: filtering new block on tags 30564 1726882900.93082: done filtering new block on tags 30564 1726882900.93085: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30564 1726882900.93090: extending task lists for all hosts with included blocks 30564 1726882900.94318: done extending task lists 30564 1726882900.94319: done processing included files 30564 1726882900.94320: results queue empty 30564 1726882900.94321: checking for any_errors_fatal 30564 1726882900.94324: done checking for any_errors_fatal 30564 1726882900.94325: checking for max_fail_percentage 30564 1726882900.94326: done checking for max_fail_percentage 30564 1726882900.94327: checking to see if all hosts have failed and the running result is not ok 30564 1726882900.94327: done checking to see if all hosts have failed 30564 1726882900.94328: getting the remaining hosts for this loop 30564 1726882900.94330: done getting the remaining hosts for this loop 30564 1726882900.94332: getting the next task for host managed_node2 30564 1726882900.94336: done getting next task for host managed_node2 30564 1726882900.94338: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882900.94341: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882900.94350: getting variables 30564 1726882900.94350: in VariableManager get_vars() 30564 1726882900.94361: Calling all_inventory to load vars for managed_node2 30564 1726882900.94365: Calling groups_inventory to load vars for managed_node2 30564 1726882900.94368: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.94373: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.94375: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.94378: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.95647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882900.97361: done with get_vars() 30564 1726882900.97384: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:41:40 -0400 (0:00:00.135) 0:01:39.555 ****** 30564 1726882900.97455: entering _queue_task() for managed_node2/include_tasks 30564 1726882900.97787: worker is 1 (out of 1 available) 30564 1726882900.97800: exiting _queue_task() for managed_node2/include_tasks 30564 1726882900.97813: done queuing things up, now waiting for results queue to drain 30564 1726882900.97814: waiting for pending results... 
30564 1726882900.98112: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882900.98249: in run() - task 0e448fcc-3ce9-4216-acec-000000001f59 30564 1726882900.98273: variable 'ansible_search_path' from source: unknown 30564 1726882900.98284: variable 'ansible_search_path' from source: unknown 30564 1726882900.98328: calling self._execute() 30564 1726882900.98456: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882900.98475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882900.98493: variable 'omit' from source: magic vars 30564 1726882900.98906: variable 'ansible_distribution_major_version' from source: facts 30564 1726882900.98926: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882900.98937: _execute() done 30564 1726882900.98950: dumping result to json 30564 1726882900.98958: done dumping result, returning 30564 1726882900.98973: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-000000001f59] 30564 1726882900.98986: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f59 30564 1726882900.99125: no more pending results, returning what we have 30564 1726882900.99130: in VariableManager get_vars() 30564 1726882900.99174: Calling all_inventory to load vars for managed_node2 30564 1726882900.99177: Calling groups_inventory to load vars for managed_node2 30564 1726882900.99181: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882900.99196: Calling all_plugins_play to load vars for managed_node2 30564 1726882900.99199: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882900.99202: Calling groups_plugins_play to load vars for managed_node2 30564 1726882900.99721: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f59 30564 1726882900.99725: WORKER PROCESS EXITING 30564 
1726882901.06811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882901.08531: done with get_vars() 30564 1726882901.08561: variable 'ansible_search_path' from source: unknown 30564 1726882901.08563: variable 'ansible_search_path' from source: unknown 30564 1726882901.08692: variable 'item' from source: include params 30564 1726882901.08725: we have included files to process 30564 1726882901.08726: generating all_blocks data 30564 1726882901.08728: done generating all_blocks data 30564 1726882901.08729: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882901.08730: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882901.08732: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882901.08913: done processing included file 30564 1726882901.08916: iterating over new_blocks loaded from include file 30564 1726882901.08917: in VariableManager get_vars() 30564 1726882901.08935: done with get_vars() 30564 1726882901.08937: filtering new block on tags 30564 1726882901.08962: done filtering new block on tags 30564 1726882901.08970: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882901.08981: extending task lists for all hosts with included blocks 30564 1726882901.09161: done extending task lists 30564 1726882901.09162: done processing included files 30564 1726882901.09166: results queue empty 30564 1726882901.09167: checking for any_errors_fatal 30564 1726882901.09174: done checking for any_errors_fatal 30564 1726882901.09174: checking for 
max_fail_percentage 30564 1726882901.09176: done checking for max_fail_percentage 30564 1726882901.09176: checking to see if all hosts have failed and the running result is not ok 30564 1726882901.09177: done checking to see if all hosts have failed 30564 1726882901.09178: getting the remaining hosts for this loop 30564 1726882901.09179: done getting the remaining hosts for this loop 30564 1726882901.09182: getting the next task for host managed_node2 30564 1726882901.09186: done getting next task for host managed_node2 30564 1726882901.09188: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882901.09195: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882901.09198: getting variables 30564 1726882901.09199: in VariableManager get_vars() 30564 1726882901.09210: Calling all_inventory to load vars for managed_node2 30564 1726882901.09212: Calling groups_inventory to load vars for managed_node2 30564 1726882901.09215: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882901.09220: Calling all_plugins_play to load vars for managed_node2 30564 1726882901.09222: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882901.09225: Calling groups_plugins_play to load vars for managed_node2 30564 1726882901.10627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882901.12384: done with get_vars() 30564 1726882901.12406: done getting variables 30564 1726882901.12533: variable 'interface' from source: play vars TASK [Get stat for interface statebr] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:41:41 -0400 (0:00:00.151) 0:01:39.706 ****** 30564 1726882901.12567: entering _queue_task() for managed_node2/stat 30564 1726882901.12914: worker is 1 (out of 1 available) 30564 1726882901.12928: exiting _queue_task() for managed_node2/stat 30564 1726882901.12942: done queuing things up, now waiting for results queue to drain 30564 1726882901.12943: waiting for pending results... 
30564 1726882901.13259: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882901.13409: in run() - task 0e448fcc-3ce9-4216-acec-000000001fe8 30564 1726882901.13431: variable 'ansible_search_path' from source: unknown 30564 1726882901.13438: variable 'ansible_search_path' from source: unknown 30564 1726882901.13480: calling self._execute() 30564 1726882901.13605: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.13621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.13641: variable 'omit' from source: magic vars 30564 1726882901.14084: variable 'ansible_distribution_major_version' from source: facts 30564 1726882901.14104: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882901.14116: variable 'omit' from source: magic vars 30564 1726882901.14191: variable 'omit' from source: magic vars 30564 1726882901.14311: variable 'interface' from source: play vars 30564 1726882901.14336: variable 'omit' from source: magic vars 30564 1726882901.14390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882901.14434: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882901.14461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882901.14494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.14517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.14554: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882901.14566: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.14579: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.14699: Set connection var ansible_timeout to 10 30564 1726882901.14711: Set connection var ansible_pipelining to False 30564 1726882901.14721: Set connection var ansible_shell_type to sh 30564 1726882901.14735: Set connection var ansible_shell_executable to /bin/sh 30564 1726882901.14747: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882901.14754: Set connection var ansible_connection to ssh 30564 1726882901.14787: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.14797: variable 'ansible_connection' from source: unknown 30564 1726882901.14809: variable 'ansible_module_compression' from source: unknown 30564 1726882901.14816: variable 'ansible_shell_type' from source: unknown 30564 1726882901.14823: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.14832: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.14844: variable 'ansible_pipelining' from source: unknown 30564 1726882901.14851: variable 'ansible_timeout' from source: unknown 30564 1726882901.14858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.15089: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882901.15107: variable 'omit' from source: magic vars 30564 1726882901.15117: starting attempt loop 30564 1726882901.15127: running the handler 30564 1726882901.15149: _low_level_execute_command(): starting 30564 1726882901.15163: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882901.16008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882901.16025: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882901.16040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.16066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.16113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882901.16130: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882901.16144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.16171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882901.16184: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882901.16194: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882901.16207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.16224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.16246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.16258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882901.16279: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882901.16296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.16385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.16408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882901.16425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.16562: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882901.18239: stdout chunk (state=3): >>>/root <<< 30564 1726882901.18345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.18426: stderr chunk (state=3): >>><<< 30564 1726882901.18442: stdout chunk (state=3): >>><<< 30564 1726882901.18576: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882901.18580: _low_level_execute_command(): starting 30564 1726882901.18583: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393 `" && echo ansible-tmp-1726882901.1848211-34948-113292335971393="` echo /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393 `" ) && sleep 0' 30564 
1726882901.19295: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882901.19310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.19326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.19345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.19399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882901.19412: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882901.19434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.19466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882901.19485: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882901.19497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882901.19509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.19528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.19544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.19565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882901.19585: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882901.19608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.19729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.19752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882901.19773: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.19926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.21788: stdout chunk (state=3): >>>ansible-tmp-1726882901.1848211-34948-113292335971393=/root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393 <<< 30564 1726882901.21951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.21954: stdout chunk (state=3): >>><<< 30564 1726882901.21958: stderr chunk (state=3): >>><<< 30564 1726882901.21965: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882901.1848211-34948-113292335971393=/root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882901.22008: variable 'ansible_module_compression' from source: 
unknown 30564 1726882901.22054: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882901.22091: variable 'ansible_facts' from source: unknown 30564 1726882901.22149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393/AnsiballZ_stat.py 30564 1726882901.22255: Sending initial data 30564 1726882901.22258: Sent initial data (153 bytes) 30564 1726882901.23219: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.23225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.23278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882901.23311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882901.23316: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.23330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.23337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.23492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.23506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882901.23516: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.23638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.25387: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882901.25483: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882901.25585: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp24toswfm /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393/AnsiballZ_stat.py <<< 30564 1726882901.25687: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882901.27044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.27122: stderr chunk (state=3): >>><<< 30564 1726882901.27126: stdout chunk (state=3): >>><<< 30564 1726882901.27146: done transferring module to remote 30564 1726882901.27162: _low_level_execute_command(): starting 30564 1726882901.27167: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393/ /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393/AnsiballZ_stat.py && sleep 0' 30564 1726882901.27659: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.27672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.27683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882901.27690: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.27699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882901.27705: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.27715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.27722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.27728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882901.27735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.27792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882901.27798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.27907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.29792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.29796: stdout chunk (state=3): >>><<< 30564 1726882901.29798: stderr chunk (state=3): >>><<< 30564 1726882901.29885: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882901.29888: _low_level_execute_command(): starting 30564 1726882901.29891: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393/AnsiballZ_stat.py && sleep 0' 30564 1726882901.30435: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882901.30441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.30456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.30590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882901.30593: stderr chunk (state=3): >>>debug2: match 
not found <<< 30564 1726882901.30595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.30597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882901.30599: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882901.30601: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882901.30603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.30605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.30607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.30609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882901.30610: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882901.30612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.30657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.30661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882901.30663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.30802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.43805: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882901.44772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882901.44824: stderr chunk (state=3): >>><<< 30564 1726882901.44827: stdout chunk (state=3): >>><<< 30564 1726882901.44842: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882901.44867: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882901.44877: _low_level_execute_command(): starting 30564 1726882901.44881: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882901.1848211-34948-113292335971393/ > /dev/null 2>&1 && sleep 0' 30564 1726882901.45316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.45323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.45372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882901.45376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.45379: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.45433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.45439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882901.45445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.45544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.47376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.47421: stderr chunk (state=3): >>><<< 30564 1726882901.47425: stdout chunk (state=3): >>><<< 30564 1726882901.47437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882901.47442: handler run complete 30564 1726882901.47459: attempt loop complete, returning result 30564 1726882901.47461: _execute() done 30564 1726882901.47465: dumping result to json 30564 1726882901.47470: done dumping result, returning 30564 1726882901.47482: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-000000001fe8] 30564 1726882901.47487: sending task result for task 0e448fcc-3ce9-4216-acec-000000001fe8 30564 1726882901.47584: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001fe8 30564 1726882901.47588: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882901.47648: no more pending results, returning what we have 30564 1726882901.47651: results queue empty 30564 1726882901.47652: checking for any_errors_fatal 30564 1726882901.47654: done checking for any_errors_fatal 30564 1726882901.47654: checking for max_fail_percentage 30564 1726882901.47656: done checking for max_fail_percentage 30564 1726882901.47657: checking to see if all hosts have failed and the running result is not ok 30564 1726882901.47658: done checking to see if all hosts have failed 30564 1726882901.47659: getting the remaining hosts for this loop 30564 1726882901.47661: done getting the remaining hosts for this loop 30564 1726882901.47666: getting the next task for host managed_node2 30564 1726882901.47677: done getting next task for host managed_node2 30564 1726882901.47680: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30564 1726882901.47683: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882901.47689: getting variables 30564 1726882901.47691: in VariableManager get_vars() 30564 1726882901.47733: Calling all_inventory to load vars for managed_node2 30564 1726882901.47736: Calling groups_inventory to load vars for managed_node2 30564 1726882901.47739: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882901.47750: Calling all_plugins_play to load vars for managed_node2 30564 1726882901.47752: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882901.47754: Calling groups_plugins_play to load vars for managed_node2 30564 1726882901.48636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882901.49595: done with get_vars() 30564 1726882901.49611: done getting variables 30564 1726882901.49655: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882901.49745: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'statebr'] ************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:41:41 -0400 (0:00:00.372) 0:01:40.079 ****** 30564 1726882901.49770: entering _queue_task() for managed_node2/assert 30564 1726882901.49982: worker is 1 (out of 1 available) 30564 1726882901.49994: exiting _queue_task() for managed_node2/assert 30564 1726882901.50006: done queuing things up, now waiting for results queue to drain 30564 1726882901.50007: waiting for pending results... 30564 1726882901.50204: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30564 1726882901.50275: in run() - task 0e448fcc-3ce9-4216-acec-000000001f5a 30564 1726882901.50286: variable 'ansible_search_path' from source: unknown 30564 1726882901.50290: variable 'ansible_search_path' from source: unknown 30564 1726882901.50318: calling self._execute() 30564 1726882901.50408: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.50412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.50418: variable 'omit' from source: magic vars 30564 1726882901.50710: variable 'ansible_distribution_major_version' from source: facts 30564 1726882901.50721: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882901.50733: variable 'omit' from source: magic vars 30564 1726882901.50760: variable 'omit' from source: magic vars 30564 1726882901.50830: variable 'interface' from source: play vars 30564 1726882901.50847: variable 'omit' from source: magic vars 30564 1726882901.50882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882901.50908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882901.50924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
30564 1726882901.50937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.50949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.50973: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882901.50977: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.50979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.51045: Set connection var ansible_timeout to 10 30564 1726882901.51049: Set connection var ansible_pipelining to False 30564 1726882901.51053: Set connection var ansible_shell_type to sh 30564 1726882901.51061: Set connection var ansible_shell_executable to /bin/sh 30564 1726882901.51065: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882901.51067: Set connection var ansible_connection to ssh 30564 1726882901.51089: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.51093: variable 'ansible_connection' from source: unknown 30564 1726882901.51096: variable 'ansible_module_compression' from source: unknown 30564 1726882901.51098: variable 'ansible_shell_type' from source: unknown 30564 1726882901.51100: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.51102: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.51104: variable 'ansible_pipelining' from source: unknown 30564 1726882901.51106: variable 'ansible_timeout' from source: unknown 30564 1726882901.51108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.51208: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882901.51218: variable 'omit' from source: magic vars 30564 1726882901.51221: starting attempt loop 30564 1726882901.51224: running the handler 30564 1726882901.51324: variable 'interface_stat' from source: set_fact 30564 1726882901.51332: Evaluated conditional (not interface_stat.stat.exists): True 30564 1726882901.51336: handler run complete 30564 1726882901.51348: attempt loop complete, returning result 30564 1726882901.51352: _execute() done 30564 1726882901.51354: dumping result to json 30564 1726882901.51357: done dumping result, returning 30564 1726882901.51362: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0e448fcc-3ce9-4216-acec-000000001f5a] 30564 1726882901.51371: sending task result for task 0e448fcc-3ce9-4216-acec-000000001f5a 30564 1726882901.51457: done sending task result for task 0e448fcc-3ce9-4216-acec-000000001f5a 30564 1726882901.51460: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882901.51516: no more pending results, returning what we have 30564 1726882901.51520: results queue empty 30564 1726882901.51521: checking for any_errors_fatal 30564 1726882901.51531: done checking for any_errors_fatal 30564 1726882901.51532: checking for max_fail_percentage 30564 1726882901.51533: done checking for max_fail_percentage 30564 1726882901.51534: checking to see if all hosts have failed and the running result is not ok 30564 1726882901.51535: done checking to see if all hosts have failed 30564 1726882901.51536: getting the remaining hosts for this loop 30564 1726882901.51538: done getting the remaining hosts for this loop 30564 1726882901.51541: getting the next task for host managed_node2 30564 1726882901.51549: done getting next task for host managed_node2 
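The assert evaluated above (`not interface_stat.stat.exists`) reduces to checking that the device node for 'statebr' is gone. A minimal shell equivalent, assuming the interface would appear under /sys/class/net on the managed node (the variable names here are illustrative, not Ansible's):

```shell
# Rough shell equivalent of the 'Assert that the interface is absent' task.
# The preceding stat task registers interface_stat; the assert passes when
# interface_stat.stat.exists is false, i.e. the device path is missing.
iface="statebr"                      # value of the 'interface' play var
if [ ! -e "/sys/class/net/$iface" ]; then
    echo "All assertions passed"     # mirrors the MSG in the log
else
    echo "assertion failed: $iface still present" >&2
    exit 1
fi
```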
30564 1726882901.51551: ^ task is: TASK: Success in test '{{ lsr_description }}' 30564 1726882901.51554: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882901.51557: getting variables 30564 1726882901.51558: in VariableManager get_vars() 30564 1726882901.51594: Calling all_inventory to load vars for managed_node2 30564 1726882901.51596: Calling groups_inventory to load vars for managed_node2 30564 1726882901.51603: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882901.51616: Calling all_plugins_play to load vars for managed_node2 30564 1726882901.51620: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882901.51622: Calling groups_plugins_play to load vars for managed_node2 30564 1726882901.52556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882901.53507: done with get_vars() 30564 1726882901.53522: done getting variables 30564 1726882901.53566: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882901.53649: variable 'lsr_description' from source: include params TASK [Success in test 'I can take a profile down that is absent'] ************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:41:41 -0400 (0:00:00.039) 0:01:40.118 ****** 30564 1726882901.53678: entering _queue_task() for managed_node2/debug 30564 1726882901.53892: worker is 1 (out of 1 available) 30564 1726882901.53905: exiting _queue_task() for managed_node2/debug 30564 1726882901.53917: done queuing things up, now waiting for results queue to drain 30564 1726882901.53919: waiting for pending results... 30564 1726882901.54106: running TaskExecutor() for managed_node2/TASK: Success in test 'I can take a profile down that is absent' 30564 1726882901.54184: in run() - task 0e448fcc-3ce9-4216-acec-00000000174b 30564 1726882901.54195: variable 'ansible_search_path' from source: unknown 30564 1726882901.54199: variable 'ansible_search_path' from source: unknown 30564 1726882901.54229: calling self._execute() 30564 1726882901.54312: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.54316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.54324: variable 'omit' from source: magic vars 30564 1726882901.54612: variable 'ansible_distribution_major_version' from source: facts 30564 1726882901.54624: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882901.54630: variable 'omit' from source: magic vars 30564 1726882901.54661: variable 'omit' from source: magic vars 30564 1726882901.54734: variable 'lsr_description' from source: include params 30564 1726882901.54747: variable 'omit' from source: magic vars 30564 1726882901.54785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882901.54812: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882901.54828: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882901.54841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.54850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.54883: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882901.54886: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.54888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.54952: Set connection var ansible_timeout to 10 30564 1726882901.54955: Set connection var ansible_pipelining to False 30564 1726882901.54958: Set connection var ansible_shell_type to sh 30564 1726882901.54965: Set connection var ansible_shell_executable to /bin/sh 30564 1726882901.54974: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882901.54976: Set connection var ansible_connection to ssh 30564 1726882901.54998: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.55001: variable 'ansible_connection' from source: unknown 30564 1726882901.55004: variable 'ansible_module_compression' from source: unknown 30564 1726882901.55007: variable 'ansible_shell_type' from source: unknown 30564 1726882901.55010: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.55012: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.55014: variable 'ansible_pipelining' from source: unknown 30564 1726882901.55016: variable 'ansible_timeout' from source: unknown 30564 1726882901.55019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.55121: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882901.55130: variable 'omit' from source: magic vars 30564 1726882901.55137: starting attempt loop 30564 1726882901.55139: running the handler 30564 1726882901.55178: handler run complete 30564 1726882901.55189: attempt loop complete, returning result 30564 1726882901.55192: _execute() done 30564 1726882901.55195: dumping result to json 30564 1726882901.55197: done dumping result, returning 30564 1726882901.55208: done running TaskExecutor() for managed_node2/TASK: Success in test 'I can take a profile down that is absent' [0e448fcc-3ce9-4216-acec-00000000174b] 30564 1726882901.55211: sending task result for task 0e448fcc-3ce9-4216-acec-00000000174b 30564 1726882901.55290: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000174b 30564 1726882901.55294: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: +++++ Success in test 'I can take a profile down that is absent' +++++ 30564 1726882901.55343: no more pending results, returning what we have 30564 1726882901.55346: results queue empty 30564 1726882901.55347: checking for any_errors_fatal 30564 1726882901.55356: done checking for any_errors_fatal 30564 1726882901.55357: checking for max_fail_percentage 30564 1726882901.55358: done checking for max_fail_percentage 30564 1726882901.55359: checking to see if all hosts have failed and the running result is not ok 30564 1726882901.55360: done checking to see if all hosts have failed 30564 1726882901.55361: getting the remaining hosts for this loop 30564 1726882901.55363: done getting the remaining hosts for this loop 30564 1726882901.55368: getting the next task for host managed_node2 30564 1726882901.55375: done getting next task for host managed_node2 30564 1726882901.55378: ^ task is: TASK: Cleanup 30564 1726882901.55381: ^ state is: HOST STATE: block=7, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882901.55386: getting variables 30564 1726882901.55387: in VariableManager get_vars() 30564 1726882901.55428: Calling all_inventory to load vars for managed_node2 30564 1726882901.55430: Calling groups_inventory to load vars for managed_node2 30564 1726882901.55434: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882901.55443: Calling all_plugins_play to load vars for managed_node2 30564 1726882901.55446: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882901.55449: Calling groups_plugins_play to load vars for managed_node2 30564 1726882901.56266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882901.57335: done with get_vars() 30564 1726882901.57350: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:41:41 -0400 (0:00:00.037) 0:01:40.155 ****** 30564 1726882901.57414: entering _queue_task() for managed_node2/include_tasks 30564 1726882901.57618: worker is 1 (out of 1 available) 30564 1726882901.57630: exiting _queue_task() for managed_node2/include_tasks 30564 1726882901.57642: done queuing things up, now waiting for results queue to drain 30564 1726882901.57644: waiting for pending results... 
30564 1726882901.57830: running TaskExecutor() for managed_node2/TASK: Cleanup 30564 1726882901.57896: in run() - task 0e448fcc-3ce9-4216-acec-00000000174f 30564 1726882901.57907: variable 'ansible_search_path' from source: unknown 30564 1726882901.57912: variable 'ansible_search_path' from source: unknown 30564 1726882901.57946: variable 'lsr_cleanup' from source: include params 30564 1726882901.58106: variable 'lsr_cleanup' from source: include params 30564 1726882901.58158: variable 'omit' from source: magic vars 30564 1726882901.58263: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.58274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.58283: variable 'omit' from source: magic vars 30564 1726882901.58459: variable 'ansible_distribution_major_version' from source: facts 30564 1726882901.58472: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882901.58476: variable 'item' from source: unknown 30564 1726882901.58520: variable 'item' from source: unknown 30564 1726882901.58542: variable 'item' from source: unknown 30564 1726882901.58590: variable 'item' from source: unknown 30564 1726882901.58711: dumping result to json 30564 1726882901.58713: done dumping result, returning 30564 1726882901.58715: done running TaskExecutor() for managed_node2/TASK: Cleanup [0e448fcc-3ce9-4216-acec-00000000174f] 30564 1726882901.58717: sending task result for task 0e448fcc-3ce9-4216-acec-00000000174f 30564 1726882901.58754: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000174f 30564 1726882901.58756: WORKER PROCESS EXITING 30564 1726882901.58783: no more pending results, returning what we have 30564 1726882901.58787: in VariableManager get_vars() 30564 1726882901.58822: Calling all_inventory to load vars for managed_node2 30564 1726882901.58824: Calling groups_inventory to load vars for managed_node2 30564 1726882901.58827: Calling 
all_plugins_inventory to load vars for managed_node2 30564 1726882901.58837: Calling all_plugins_play to load vars for managed_node2 30564 1726882901.58839: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882901.58841: Calling groups_plugins_play to load vars for managed_node2 30564 1726882901.59634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882901.60582: done with get_vars() 30564 1726882901.60596: variable 'ansible_search_path' from source: unknown 30564 1726882901.60597: variable 'ansible_search_path' from source: unknown 30564 1726882901.60624: we have included files to process 30564 1726882901.60625: generating all_blocks data 30564 1726882901.60627: done generating all_blocks data 30564 1726882901.60630: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882901.60631: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882901.60632: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882901.60759: done processing included file 30564 1726882901.60761: iterating over new_blocks loaded from include file 30564 1726882901.60762: in VariableManager get_vars() 30564 1726882901.60775: done with get_vars() 30564 1726882901.60776: filtering new block on tags 30564 1726882901.60792: done filtering new block on tags 30564 1726882901.60793: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30564 1726882901.60797: extending task lists for all hosts with included blocks 
30564 1726882901.61584: done extending task lists 30564 1726882901.61586: done processing included files 30564 1726882901.61586: results queue empty 30564 1726882901.61587: checking for any_errors_fatal 30564 1726882901.61590: done checking for any_errors_fatal 30564 1726882901.61591: checking for max_fail_percentage 30564 1726882901.61591: done checking for max_fail_percentage 30564 1726882901.61592: checking to see if all hosts have failed and the running result is not ok 30564 1726882901.61592: done checking to see if all hosts have failed 30564 1726882901.61593: getting the remaining hosts for this loop 30564 1726882901.61594: done getting the remaining hosts for this loop 30564 1726882901.61595: getting the next task for host managed_node2 30564 1726882901.61598: done getting next task for host managed_node2 30564 1726882901.61600: ^ task is: TASK: Cleanup profile and device 30564 1726882901.61602: ^ state is: HOST STATE: block=7, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882901.61603: getting variables 30564 1726882901.61604: in VariableManager get_vars() 30564 1726882901.61611: Calling all_inventory to load vars for managed_node2 30564 1726882901.61613: Calling groups_inventory to load vars for managed_node2 30564 1726882901.61614: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882901.61618: Calling all_plugins_play to load vars for managed_node2 30564 1726882901.61619: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882901.61621: Calling groups_plugins_play to load vars for managed_node2 30564 1726882901.62352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882901.63271: done with get_vars() 30564 1726882901.63285: done getting variables 30564 1726882901.63315: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:41:41 -0400 (0:00:00.059) 0:01:40.214 ****** 30564 1726882901.63336: entering _queue_task() for managed_node2/shell 30564 1726882901.63548: worker is 1 (out of 1 available) 30564 1726882901.63560: exiting _queue_task() for managed_node2/shell 30564 1726882901.63573: done queuing things up, now waiting for results queue to drain 30564 1726882901.63574: waiting for pending results... 
30564 1726882901.63761: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30564 1726882901.63827: in run() - task 0e448fcc-3ce9-4216-acec-00000000200b 30564 1726882901.63837: variable 'ansible_search_path' from source: unknown 30564 1726882901.63841: variable 'ansible_search_path' from source: unknown 30564 1726882901.63869: calling self._execute() 30564 1726882901.63946: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.63950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.63960: variable 'omit' from source: magic vars 30564 1726882901.64240: variable 'ansible_distribution_major_version' from source: facts 30564 1726882901.64251: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882901.64257: variable 'omit' from source: magic vars 30564 1726882901.64294: variable 'omit' from source: magic vars 30564 1726882901.64397: variable 'interface' from source: play vars 30564 1726882901.64411: variable 'omit' from source: magic vars 30564 1726882901.64445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882901.64471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882901.64489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882901.64506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.64514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882901.64538: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882901.64541: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.64544: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.64615: Set connection var ansible_timeout to 10 30564 1726882901.64618: Set connection var ansible_pipelining to False 30564 1726882901.64621: Set connection var ansible_shell_type to sh 30564 1726882901.64626: Set connection var ansible_shell_executable to /bin/sh 30564 1726882901.64634: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882901.64637: Set connection var ansible_connection to ssh 30564 1726882901.64655: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.64658: variable 'ansible_connection' from source: unknown 30564 1726882901.64661: variable 'ansible_module_compression' from source: unknown 30564 1726882901.64665: variable 'ansible_shell_type' from source: unknown 30564 1726882901.64667: variable 'ansible_shell_executable' from source: unknown 30564 1726882901.64669: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882901.64675: variable 'ansible_pipelining' from source: unknown 30564 1726882901.64679: variable 'ansible_timeout' from source: unknown 30564 1726882901.64681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882901.64780: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882901.64789: variable 'omit' from source: magic vars 30564 1726882901.64793: starting attempt loop 30564 1726882901.64796: running the handler 30564 1726882901.64806: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882901.64820: _low_level_execute_command(): starting 30564 1726882901.64827: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882901.65349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.65358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.65394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882901.65409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.65459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.65475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.65597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.67255: stdout chunk (state=3): >>>/root <<< 30564 1726882901.67359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.67406: 
stderr chunk (state=3): >>><<< 30564 1726882901.67410: stdout chunk (state=3): >>><<< 30564 1726882901.67429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882901.67442: _low_level_execute_command(): starting 30564 1726882901.67447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563 `" && echo ansible-tmp-1726882901.6742866-34974-162460444692563="` echo /root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563 `" ) && sleep 0' 30564 1726882901.67874: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.67881: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.67915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.67928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.67979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.67991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.68099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.69957: stdout chunk (state=3): >>>ansible-tmp-1726882901.6742866-34974-162460444692563=/root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563 <<< 30564 1726882901.70069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.70112: stderr chunk (state=3): >>><<< 30564 1726882901.70116: stdout chunk (state=3): >>><<< 30564 1726882901.70129: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882901.6742866-34974-162460444692563=/root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882901.70152: variable 'ansible_module_compression' from source: unknown 30564 1726882901.70196: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882901.70228: variable 'ansible_facts' from source: unknown 30564 1726882901.70291: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563/AnsiballZ_command.py 30564 1726882901.70393: Sending initial data 30564 1726882901.70397: Sent initial data (156 bytes) 30564 1726882901.71032: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.71038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.71073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.71088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882901.71098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.71143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882901.71155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.71255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.72982: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 30564 1726882901.72990: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882901.73084: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882901.73187: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-30564uwjv555r/tmp8em7s41g /root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563/AnsiballZ_command.py <<< 30564 1726882901.73285: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882901.74310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.74395: stderr chunk (state=3): >>><<< 30564 1726882901.74398: stdout chunk (state=3): >>><<< 30564 1726882901.74414: done transferring module to remote 30564 1726882901.74422: _low_level_execute_command(): starting 30564 1726882901.74426: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563/ /root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563/AnsiballZ_command.py && sleep 0' 30564 1726882901.74844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.74862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.74881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.74892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
30564 1726882901.74940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.74947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.75062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.76791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.76833: stderr chunk (state=3): >>><<< 30564 1726882901.76836: stdout chunk (state=3): >>><<< 30564 1726882901.76847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882901.76851: _low_level_execute_command(): starting 30564 1726882901.76854: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563/AnsiballZ_command.py && sleep 0' 30564 1726882901.77266: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882901.77282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.77294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882901.77305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.77315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.77360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.77381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.77489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.94162: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f 
/etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:41:41.904146", "end": "2024-09-20 21:41:41.939536", "delta": "0:00:00.035390", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882901.95352: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. <<< 30564 1726882901.95416: stderr chunk (state=3): >>><<< 30564 1726882901.95420: stdout chunk (state=3): >>><<< 30564 1726882901.95439: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:41:41.904146", "end": "2024-09-20 21:41:41.939536", "delta": "0:00:00.035390", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
30564 1726882901.95475: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882901.95481: _low_level_execute_command(): starting 30564 1726882901.95486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882901.6742866-34974-162460444692563/ > /dev/null 2>&1 && sleep 0' 30564 1726882901.95950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882901.95957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882901.95988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.96000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882901.96052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882901.96072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882901.96182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882901.97978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882901.98020: stderr chunk (state=3): >>><<< 30564 1726882901.98024: stdout chunk (state=3): >>><<< 30564 1726882901.98036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30564 1726882901.98042: handler run complete
30564 1726882901.98063: Evaluated conditional (False): False
30564 1726882901.98075: attempt loop complete, returning result
30564 1726882901.98078: _execute() done
30564 1726882901.98080: dumping result to json
30564 1726882901.98085: done dumping result, returning
30564 1726882901.98093: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0e448fcc-3ce9-4216-acec-00000000200b]
30564 1726882901.98099: sending task result for task 0e448fcc-3ce9-4216-acec-00000000200b
30564 1726882901.98197: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000200b
30564 1726882901.98200: WORKER PROCESS EXITING

fatal: [managed_node2]: FAILED! => {
    "changed": false,
    "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n",
    "delta": "0:00:00.035390",
    "end": "2024-09-20 21:41:41.939536",
    "rc": 1,
    "start": "2024-09-20 21:41:41.904146"
}

STDERR:

Error: unknown connection 'statebr'.
Error: cannot delete unknown connection(s): 'statebr'.
Cannot find device "statebr"

MSG:

non-zero return code
...ignoring
30564 1726882901.98265: no more pending results, returning what we have
30564 1726882901.98269: results queue empty
30564 1726882901.98270: checking for any_errors_fatal
30564 1726882901.98272: done checking for any_errors_fatal
30564 1726882901.98272: checking for max_fail_percentage
30564 1726882901.98274: done checking for max_fail_percentage
30564 1726882901.98275: checking to see if all hosts have failed and the running result is not ok
30564 1726882901.98276: done checking to see if all hosts have failed
30564 1726882901.98278: getting the remaining hosts for this loop
30564 1726882901.98279: done getting the remaining hosts for this loop
30564 1726882901.98283: getting the next task for host managed_node2
30564 1726882901.98294: done getting next task for host managed_node2
30564 1726882901.98297: ^ task is: TASK: Include the task 'run_test.yml'
30564 1726882901.98300: ^ state is: HOST STATE: block=8, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
30564 1726882901.98304: getting variables
30564 1726882901.98306: in VariableManager get_vars()
30564 1726882901.98348: Calling all_inventory to load vars for managed_node2
30564 1726882901.98350: Calling groups_inventory to load vars for managed_node2
30564 1726882901.98354: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882901.98372: Calling all_plugins_play to load vars for managed_node2
30564 1726882901.98375: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882901.98378: Calling groups_plugins_play to load vars for managed_node2
30564 1726882901.99365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882902.00303: done with get_vars()
30564 1726882902.00320: done getting variables

TASK [Include the task 'run_test.yml'] *****************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_states.yml:124
Friday 20 September 2024 21:41:42 -0400 (0:00:00.370) 0:01:40.585 ******
30564 1726882902.00388: entering _queue_task() for managed_node2/include_tasks
30564 1726882902.00604: worker is 1 (out of 1 available)
30564 1726882902.00618: exiting _queue_task() for managed_node2/include_tasks
30564 1726882902.00630: done queuing things up, now waiting for results queue to drain
30564 1726882902.00631: waiting for pending results...
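For context, the failed-but-ignored "Cleanup profile and device" task above corresponds to a shell task of roughly this shape. This is a sketch reconstructed from the logged command string; `ignore_errors` is inferred from the `...ignoring` marker, since the rc=1 result did not abort the play.

```yaml
# Reconstructed sketch, not the verbatim test source.
# Each line of the block runs even if an earlier one fails (no `set -e`),
# which matches the three separate errors seen in STDERR above.
- name: Cleanup profile and device
  ansible.builtin.shell: |
    nmcli con delete statebr
    nmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr
    rm -f /etc/sysconfig/network-scripts/ifcfg-statebr
    ip link del statebr
  ignore_errors: true
```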
30564 1726882902.00821: running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' 30564 1726882902.00880: in run() - task 0e448fcc-3ce9-4216-acec-000000000017 30564 1726882902.00892: variable 'ansible_search_path' from source: unknown 30564 1726882902.00919: calling self._execute() 30564 1726882902.00998: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.01002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.01010: variable 'omit' from source: magic vars 30564 1726882902.01297: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.01309: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.01312: _execute() done 30564 1726882902.01316: dumping result to json 30564 1726882902.01319: done dumping result, returning 30564 1726882902.01324: done running TaskExecutor() for managed_node2/TASK: Include the task 'run_test.yml' [0e448fcc-3ce9-4216-acec-000000000017] 30564 1726882902.01330: sending task result for task 0e448fcc-3ce9-4216-acec-000000000017 30564 1726882902.01433: done sending task result for task 0e448fcc-3ce9-4216-acec-000000000017 30564 1726882902.01436: WORKER PROCESS EXITING 30564 1726882902.01467: no more pending results, returning what we have 30564 1726882902.01471: in VariableManager get_vars() 30564 1726882902.01514: Calling all_inventory to load vars for managed_node2 30564 1726882902.01516: Calling groups_inventory to load vars for managed_node2 30564 1726882902.01520: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.01531: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.01534: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.01537: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.02346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 30564 1726882902.03280: done with get_vars() 30564 1726882902.03295: variable 'ansible_search_path' from source: unknown 30564 1726882902.03305: we have included files to process 30564 1726882902.03305: generating all_blocks data 30564 1726882902.03307: done generating all_blocks data 30564 1726882902.03311: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882902.03312: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882902.03313: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 30564 1726882902.03569: in VariableManager get_vars() 30564 1726882902.03582: done with get_vars() 30564 1726882902.03609: in VariableManager get_vars() 30564 1726882902.03622: done with get_vars() 30564 1726882902.03648: in VariableManager get_vars() 30564 1726882902.03659: done with get_vars() 30564 1726882902.03685: in VariableManager get_vars() 30564 1726882902.03697: done with get_vars() 30564 1726882902.03725: in VariableManager get_vars() 30564 1726882902.03736: done with get_vars() 30564 1726882902.03988: in VariableManager get_vars() 30564 1726882902.03999: done with get_vars() 30564 1726882902.04006: done processing included file 30564 1726882902.04008: iterating over new_blocks loaded from include file 30564 1726882902.04009: in VariableManager get_vars() 30564 1726882902.04015: done with get_vars() 30564 1726882902.04016: filtering new block on tags 30564 1726882902.04077: done filtering new block on tags 30564 1726882902.04079: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node2 30564 1726882902.04083: extending task lists for all hosts with included 
blocks 30564 1726882902.04105: done extending task lists 30564 1726882902.04105: done processing included files 30564 1726882902.04106: results queue empty 30564 1726882902.04106: checking for any_errors_fatal 30564 1726882902.04109: done checking for any_errors_fatal 30564 1726882902.04110: checking for max_fail_percentage 30564 1726882902.04111: done checking for max_fail_percentage 30564 1726882902.04111: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.04112: done checking to see if all hosts have failed 30564 1726882902.04112: getting the remaining hosts for this loop 30564 1726882902.04113: done getting the remaining hosts for this loop 30564 1726882902.04115: getting the next task for host managed_node2 30564 1726882902.04117: done getting next task for host managed_node2 30564 1726882902.04118: ^ task is: TASK: TEST: {{ lsr_description }} 30564 1726882902.04120: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30564 1726882902.04122: getting variables
30564 1726882902.04122: in VariableManager get_vars()
30564 1726882902.04128: Calling all_inventory to load vars for managed_node2
30564 1726882902.04129: Calling groups_inventory to load vars for managed_node2
30564 1726882902.04131: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882902.04134: Calling all_plugins_play to load vars for managed_node2
30564 1726882902.04136: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882902.04137: Calling groups_plugins_play to load vars for managed_node2
30564 1726882902.04839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882902.05750: done with get_vars()
30564 1726882902.05767: done getting variables
30564 1726882902.05794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
30564 1726882902.05874: variable 'lsr_description' from source: include params

TASK [TEST: I will not get an error when I try to remove an absent profile] ****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5
Friday 20 September 2024 21:41:42 -0400 (0:00:00.055) 0:01:40.640 ******
30564 1726882902.05896: entering _queue_task() for managed_node2/debug
30564 1726882902.06108: worker is 1 (out of 1 available)
30564 1726882902.06121: exiting _queue_task() for managed_node2/debug
30564 1726882902.06134: done queuing things up, now waiting for results queue to drain
30564 1726882902.06135: waiting for pending results...
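The templated TEST banner task at run_test.yml:5 likely looks something like the following. This is a hypothetical reconstruction inferred from the task name `TEST: {{ lsr_description }}` and the `##########` framing in the task's output; the exact `msg` formatting in the real file may differ.

```yaml
# Hypothetical sketch of run_test.yml:5 (name and msg framing inferred
# from the logged banner and debug output).
- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: "########## {{ lsr_description }} ##########"
```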
30564 1726882902.06323: running TaskExecutor() for managed_node2/TASK: TEST: I will not get an error when I try to remove an absent profile 30564 1726882902.06391: in run() - task 0e448fcc-3ce9-4216-acec-0000000020ad 30564 1726882902.06403: variable 'ansible_search_path' from source: unknown 30564 1726882902.06407: variable 'ansible_search_path' from source: unknown 30564 1726882902.06436: calling self._execute() 30564 1726882902.06513: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.06519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.06528: variable 'omit' from source: magic vars 30564 1726882902.06807: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.06817: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.06824: variable 'omit' from source: magic vars 30564 1726882902.06852: variable 'omit' from source: magic vars 30564 1726882902.06922: variable 'lsr_description' from source: include params 30564 1726882902.06935: variable 'omit' from source: magic vars 30564 1726882902.06970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882902.07000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.07019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882902.07032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.07042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.07066: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.07070: variable 'ansible_host' from source: host vars for 'managed_node2' 
30564 1726882902.07078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.07144: Set connection var ansible_timeout to 10 30564 1726882902.07147: Set connection var ansible_pipelining to False 30564 1726882902.07150: Set connection var ansible_shell_type to sh 30564 1726882902.07156: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.07162: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.07166: Set connection var ansible_connection to ssh 30564 1726882902.07188: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.07191: variable 'ansible_connection' from source: unknown 30564 1726882902.07194: variable 'ansible_module_compression' from source: unknown 30564 1726882902.07196: variable 'ansible_shell_type' from source: unknown 30564 1726882902.07200: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.07203: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.07206: variable 'ansible_pipelining' from source: unknown 30564 1726882902.07208: variable 'ansible_timeout' from source: unknown 30564 1726882902.07210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.07308: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.07317: variable 'omit' from source: magic vars 30564 1726882902.07325: starting attempt loop 30564 1726882902.07328: running the handler 30564 1726882902.07362: handler run complete 30564 1726882902.07377: attempt loop complete, returning result 30564 1726882902.07380: _execute() done 30564 1726882902.07383: dumping result to json 30564 1726882902.07385: done dumping result, 
returning
30564 1726882902.07391: done running TaskExecutor() for managed_node2/TASK: TEST: I will not get an error when I try to remove an absent profile [0e448fcc-3ce9-4216-acec-0000000020ad]
30564 1726882902.07396: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020ad
30564 1726882902.07483: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020ad
30564 1726882902.07486: WORKER PROCESS EXITING

ok: [managed_node2] => {}

MSG:

########## I will not get an error when I try to remove an absent profile ##########
30564 1726882902.07533: no more pending results, returning what we have
30564 1726882902.07541: results queue empty
30564 1726882902.07542: checking for any_errors_fatal
30564 1726882902.07544: done checking for any_errors_fatal
30564 1726882902.07544: checking for max_fail_percentage
30564 1726882902.07546: done checking for max_fail_percentage
30564 1726882902.07547: checking to see if all hosts have failed and the running result is not ok
30564 1726882902.07547: done checking to see if all hosts have failed
30564 1726882902.07548: getting the remaining hosts for this loop
30564 1726882902.07550: done getting the remaining hosts for this loop
30564 1726882902.07553: getting the next task for host managed_node2
30564 1726882902.07560: done getting next task for host managed_node2
30564 1726882902.07562: ^ task is: TASK: Show item
30564 1726882902.07566: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
30564 1726882902.07569: getting variables
30564 1726882902.07571: in VariableManager get_vars()
30564 1726882902.07603: Calling all_inventory to load vars for managed_node2
30564 1726882902.07605: Calling groups_inventory to load vars for managed_node2
30564 1726882902.07608: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882902.07622: Calling all_plugins_play to load vars for managed_node2
30564 1726882902.07624: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882902.07627: Calling groups_plugins_play to load vars for managed_node2
30564 1726882902.08514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882902.09448: done with get_vars()
30564 1726882902.09466: done getting variables
30564 1726882902.09504: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show item] ***************************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9
Friday 20 September 2024 21:41:42 -0400 (0:00:00.036) 0:01:40.676 ******
30564 1726882902.09523: entering _queue_task() for managed_node2/debug
30564 1726882902.09715: worker is 1 (out of 1 available)
30564 1726882902.09729: exiting _queue_task() for managed_node2/debug
30564 1726882902.09740: done queuing things up, now waiting for results queue to drain
30564 1726882902.09742: waiting for pending results...
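The "Show item" task announced above is a looped debug task at run_test.yml:9; its later `(item=lsr_description)` result, with `ansible_loop_var: item`, is consistent with a sketch like this. Only the `lsr_description` item is visible in this log excerpt, so any further loop entries are omitted.

```yaml
# Sketch of the "Show item" task, inferred from the logged loop result;
# the real loop probably lists additional lsr_* variables not shown here.
- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"
  loop:
    - lsr_description
```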
30564 1726882902.09921: running TaskExecutor() for managed_node2/TASK: Show item 30564 1726882902.09994: in run() - task 0e448fcc-3ce9-4216-acec-0000000020ae 30564 1726882902.10006: variable 'ansible_search_path' from source: unknown 30564 1726882902.10009: variable 'ansible_search_path' from source: unknown 30564 1726882902.10054: variable 'omit' from source: magic vars 30564 1726882902.10173: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.10183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.10192: variable 'omit' from source: magic vars 30564 1726882902.10446: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.10457: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.10462: variable 'omit' from source: magic vars 30564 1726882902.10489: variable 'omit' from source: magic vars 30564 1726882902.10519: variable 'item' from source: unknown 30564 1726882902.10567: variable 'item' from source: unknown 30564 1726882902.10581: variable 'omit' from source: magic vars 30564 1726882902.10614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882902.10642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.10659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882902.10680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.10691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.10714: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.10717: variable 'ansible_host' from source: host vars for 'managed_node2' 
30564 1726882902.10719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.10790: Set connection var ansible_timeout to 10 30564 1726882902.10794: Set connection var ansible_pipelining to False 30564 1726882902.10797: Set connection var ansible_shell_type to sh 30564 1726882902.10802: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.10809: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.10811: Set connection var ansible_connection to ssh 30564 1726882902.10827: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.10830: variable 'ansible_connection' from source: unknown 30564 1726882902.10834: variable 'ansible_module_compression' from source: unknown 30564 1726882902.10837: variable 'ansible_shell_type' from source: unknown 30564 1726882902.10839: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.10841: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.10843: variable 'ansible_pipelining' from source: unknown 30564 1726882902.10847: variable 'ansible_timeout' from source: unknown 30564 1726882902.10853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.10947: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.10957: variable 'omit' from source: magic vars 30564 1726882902.10960: starting attempt loop 30564 1726882902.10963: running the handler 30564 1726882902.11002: variable 'lsr_description' from source: include params 30564 1726882902.11050: variable 'lsr_description' from source: include params 30564 1726882902.11057: handler run complete 30564 1726882902.11073: attempt loop 
complete, returning result 30564 1726882902.11085: variable 'item' from source: unknown 30564 1726882902.11130: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "I will not get an error when I try to remove an absent profile" } 30564 1726882902.11283: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.11287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.11289: variable 'omit' from source: magic vars 30564 1726882902.11347: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.11351: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.11356: variable 'omit' from source: magic vars 30564 1726882902.11370: variable 'omit' from source: magic vars 30564 1726882902.11394: variable 'item' from source: unknown 30564 1726882902.11439: variable 'item' from source: unknown 30564 1726882902.11454: variable 'omit' from source: magic vars 30564 1726882902.11467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.11475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.11481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.11490: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.11493: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.11495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.11544: Set connection var ansible_timeout to 10 30564 1726882902.11547: Set connection var 
ansible_pipelining to False 30564 1726882902.11550: Set connection var ansible_shell_type to sh 30564 1726882902.11555: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.11563: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.11566: Set connection var ansible_connection to ssh 30564 1726882902.11582: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.11585: variable 'ansible_connection' from source: unknown 30564 1726882902.11587: variable 'ansible_module_compression' from source: unknown 30564 1726882902.11590: variable 'ansible_shell_type' from source: unknown 30564 1726882902.11592: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.11594: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.11596: variable 'ansible_pipelining' from source: unknown 30564 1726882902.11600: variable 'ansible_timeout' from source: unknown 30564 1726882902.11604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.11661: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.11673: variable 'omit' from source: magic vars 30564 1726882902.11676: starting attempt loop 30564 1726882902.11678: running the handler 30564 1726882902.11694: variable 'lsr_setup' from source: include params 30564 1726882902.11744: variable 'lsr_setup' from source: include params 30564 1726882902.11784: handler run complete 30564 1726882902.11795: attempt loop complete, returning result 30564 1726882902.11806: variable 'item' from source: unknown 30564 1726882902.11853: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_setup) => { "ansible_loop_var": "item", 
"item": "lsr_setup", "lsr_setup": [ "tasks/create_bridge_profile.yml", "tasks/activate_profile.yml", "tasks/remove+down_profile.yml" ] } 30564 1726882902.11937: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.11945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.11948: variable 'omit' from source: magic vars 30564 1726882902.12042: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.12045: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.12054: variable 'omit' from source: magic vars 30564 1726882902.12065: variable 'omit' from source: magic vars 30564 1726882902.12091: variable 'item' from source: unknown 30564 1726882902.12134: variable 'item' from source: unknown 30564 1726882902.12146: variable 'omit' from source: magic vars 30564 1726882902.12163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.12166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.12176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.12183: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.12186: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.12189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.12234: Set connection var ansible_timeout to 10 30564 1726882902.12237: Set connection var ansible_pipelining to False 30564 1726882902.12240: Set connection var ansible_shell_type to sh 30564 1726882902.12245: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.12252: Set connection var 
ansible_module_compression to ZIP_DEFLATED 30564 1726882902.12255: Set connection var ansible_connection to ssh 30564 1726882902.12273: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.12276: variable 'ansible_connection' from source: unknown 30564 1726882902.12281: variable 'ansible_module_compression' from source: unknown 30564 1726882902.12283: variable 'ansible_shell_type' from source: unknown 30564 1726882902.12285: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.12287: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.12289: variable 'ansible_pipelining' from source: unknown 30564 1726882902.12292: variable 'ansible_timeout' from source: unknown 30564 1726882902.12294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.12347: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.12353: variable 'omit' from source: magic vars 30564 1726882902.12358: starting attempt loop 30564 1726882902.12361: running the handler 30564 1726882902.12380: variable 'lsr_test' from source: include params 30564 1726882902.12423: variable 'lsr_test' from source: include params 30564 1726882902.12437: handler run complete 30564 1726882902.12446: attempt loop complete, returning result 30564 1726882902.12457: variable 'item' from source: unknown 30564 1726882902.12503: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/remove+down_profile.yml" ] } 30564 1726882902.12581: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.12585: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882902.12587: variable 'omit' from source: magic vars 30564 1726882902.12685: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.12689: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.12694: variable 'omit' from source: magic vars 30564 1726882902.12704: variable 'omit' from source: magic vars 30564 1726882902.12731: variable 'item' from source: unknown 30564 1726882902.12774: variable 'item' from source: unknown 30564 1726882902.12785: variable 'omit' from source: magic vars 30564 1726882902.12799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.12805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.12813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.12820: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.12823: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.12825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.12874: Set connection var ansible_timeout to 10 30564 1726882902.12877: Set connection var ansible_pipelining to False 30564 1726882902.12880: Set connection var ansible_shell_type to sh 30564 1726882902.12884: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.12891: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.12893: Set connection var ansible_connection to ssh 30564 1726882902.12908: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.12910: variable 'ansible_connection' from source: unknown 30564 1726882902.12913: 
variable 'ansible_module_compression' from source: unknown 30564 1726882902.12919: variable 'ansible_shell_type' from source: unknown 30564 1726882902.12922: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.12924: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.12926: variable 'ansible_pipelining' from source: unknown 30564 1726882902.12928: variable 'ansible_timeout' from source: unknown 30564 1726882902.12930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.12988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.12994: variable 'omit' from source: magic vars 30564 1726882902.12997: starting attempt loop 30564 1726882902.13000: running the handler 30564 1726882902.13013: variable 'lsr_assert' from source: include params 30564 1726882902.13057: variable 'lsr_assert' from source: include params 30564 1726882902.13073: handler run complete 30564 1726882902.13084: attempt loop complete, returning result 30564 1726882902.13095: variable 'item' from source: unknown 30564 1726882902.13139: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_profile_absent.yml", "tasks/get_NetworkManager_NVR.yml" ] } 30564 1726882902.13219: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.13223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.13225: variable 'omit' from source: magic vars 30564 1726882902.13341: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.13348: Evaluated conditional 
(ansible_distribution_major_version != '6'): True 30564 1726882902.13355: variable 'omit' from source: magic vars 30564 1726882902.13361: variable 'omit' from source: magic vars 30564 1726882902.13389: variable 'item' from source: unknown 30564 1726882902.13433: variable 'item' from source: unknown 30564 1726882902.13444: variable 'omit' from source: magic vars 30564 1726882902.13457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.13465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.13473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.13481: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.13484: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.13486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.13531: Set connection var ansible_timeout to 10 30564 1726882902.13534: Set connection var ansible_pipelining to False 30564 1726882902.13537: Set connection var ansible_shell_type to sh 30564 1726882902.13544: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.13549: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.13552: Set connection var ansible_connection to ssh 30564 1726882902.13572: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.13575: variable 'ansible_connection' from source: unknown 30564 1726882902.13578: variable 'ansible_module_compression' from source: unknown 30564 1726882902.13580: variable 'ansible_shell_type' from source: unknown 30564 1726882902.13582: variable 'ansible_shell_executable' from source: unknown 30564 
1726882902.13584: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.13586: variable 'ansible_pipelining' from source: unknown 30564 1726882902.13589: variable 'ansible_timeout' from source: unknown 30564 1726882902.13591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.13644: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.13650: variable 'omit' from source: magic vars 30564 1726882902.13655: starting attempt loop 30564 1726882902.13661: running the handler 30564 1726882902.13680: variable 'lsr_assert_when' from source: include params 30564 1726882902.13720: variable 'lsr_assert_when' from source: include params 30564 1726882902.13779: variable 'network_provider' from source: set_fact 30564 1726882902.13803: handler run complete 30564 1726882902.13813: attempt loop complete, returning result 30564 1726882902.13824: variable 'item' from source: unknown 30564 1726882902.13867: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": [ { "condition": true, "what": "tasks/assert_device_absent.yml" } ] } 30564 1726882902.13948: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.13951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.13954: variable 'omit' from source: magic vars 30564 1726882902.14045: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.14049: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.14056: variable 'omit' from source: magic vars 30564 1726882902.14066: 
variable 'omit' from source: magic vars 30564 1726882902.14096: variable 'item' from source: unknown 30564 1726882902.14136: variable 'item' from source: unknown 30564 1726882902.14147: variable 'omit' from source: magic vars 30564 1726882902.14164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.14167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.14174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.14182: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.14185: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.14187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.14233: Set connection var ansible_timeout to 10 30564 1726882902.14236: Set connection var ansible_pipelining to False 30564 1726882902.14239: Set connection var ansible_shell_type to sh 30564 1726882902.14244: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.14251: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.14253: Set connection var ansible_connection to ssh 30564 1726882902.14272: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.14275: variable 'ansible_connection' from source: unknown 30564 1726882902.14278: variable 'ansible_module_compression' from source: unknown 30564 1726882902.14281: variable 'ansible_shell_type' from source: unknown 30564 1726882902.14283: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.14285: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.14287: variable 'ansible_pipelining' from source: 
unknown 30564 1726882902.14289: variable 'ansible_timeout' from source: unknown 30564 1726882902.14291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.14347: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.14353: variable 'omit' from source: magic vars 30564 1726882902.14356: starting attempt loop 30564 1726882902.14358: running the handler 30564 1726882902.14378: variable 'lsr_fail_debug' from source: play vars 30564 1726882902.14421: variable 'lsr_fail_debug' from source: play vars 30564 1726882902.14434: handler run complete 30564 1726882902.14442: attempt loop complete, returning result 30564 1726882902.14454: variable 'item' from source: unknown 30564 1726882902.14497: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 30564 1726882902.14579: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.14582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.14584: variable 'omit' from source: magic vars 30564 1726882902.14674: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.14678: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.14684: variable 'omit' from source: magic vars 30564 1726882902.14694: variable 'omit' from source: magic vars 30564 1726882902.14721: variable 'item' from source: unknown 30564 1726882902.14763: variable 'item' from source: unknown 30564 1726882902.14777: variable 'omit' from source: magic vars 30564 1726882902.14790: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.14799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.14802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.14813: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.14815: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.14818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.14859: Set connection var ansible_timeout to 10 30564 1726882902.14862: Set connection var ansible_pipelining to False 30564 1726882902.14871: Set connection var ansible_shell_type to sh 30564 1726882902.14874: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.14879: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.14881: Set connection var ansible_connection to ssh 30564 1726882902.14895: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.14898: variable 'ansible_connection' from source: unknown 30564 1726882902.14903: variable 'ansible_module_compression' from source: unknown 30564 1726882902.14908: variable 'ansible_shell_type' from source: unknown 30564 1726882902.14910: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.14912: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.14914: variable 'ansible_pipelining' from source: unknown 30564 1726882902.14920: variable 'ansible_timeout' from source: unknown 30564 1726882902.14923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.14978: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.14982: variable 'omit' from source: magic vars 30564 1726882902.14984: starting attempt loop 30564 1726882902.14987: running the handler 30564 1726882902.15000: variable 'lsr_cleanup' from source: include params 30564 1726882902.15045: variable 'lsr_cleanup' from source: include params 30564 1726882902.15058: handler run complete 30564 1726882902.15073: attempt loop complete, returning result 30564 1726882902.15088: variable 'item' from source: unknown 30564 1726882902.15126: variable 'item' from source: unknown ok: [managed_node2] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_profile+device.yml", "tasks/check_network_dns.yml" ] } 30564 1726882902.15204: dumping result to json 30564 1726882902.15207: done dumping result, returning 30564 1726882902.15210: done running TaskExecutor() for managed_node2/TASK: Show item [0e448fcc-3ce9-4216-acec-0000000020ae] 30564 1726882902.15212: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020ae 30564 1726882902.15310: no more pending results, returning what we have 30564 1726882902.15315: results queue empty 30564 1726882902.15316: checking for any_errors_fatal 30564 1726882902.15321: done checking for any_errors_fatal 30564 1726882902.15322: checking for max_fail_percentage 30564 1726882902.15324: done checking for max_fail_percentage 30564 1726882902.15325: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.15325: done checking to see if all hosts have failed 30564 1726882902.15327: getting the remaining hosts for this loop 30564 1726882902.15328: done getting the remaining hosts for this loop 30564 1726882902.15331: getting the next 
task for host managed_node2 30564 1726882902.15337: done getting next task for host managed_node2 30564 1726882902.15339: ^ task is: TASK: Include the task 'show_interfaces.yml' 30564 1726882902.15342: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882902.15346: getting variables 30564 1726882902.15349: in VariableManager get_vars() 30564 1726882902.15387: Calling all_inventory to load vars for managed_node2 30564 1726882902.15390: Calling groups_inventory to load vars for managed_node2 30564 1726882902.15393: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.15403: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.15406: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.15408: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.15981: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020ae 30564 1726882902.16233: WORKER PROCESS EXITING 30564 1726882902.16243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.17182: done with get_vars() 30564 1726882902.17198: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:41:42 -0400 (0:00:00.077) 
0:01:40.753 ****** 30564 1726882902.17260: entering _queue_task() for managed_node2/include_tasks 30564 1726882902.17466: worker is 1 (out of 1 available) 30564 1726882902.17479: exiting _queue_task() for managed_node2/include_tasks 30564 1726882902.17491: done queuing things up, now waiting for results queue to drain 30564 1726882902.17493: waiting for pending results... 30564 1726882902.17680: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 30564 1726882902.17752: in run() - task 0e448fcc-3ce9-4216-acec-0000000020af 30564 1726882902.17764: variable 'ansible_search_path' from source: unknown 30564 1726882902.17768: variable 'ansible_search_path' from source: unknown 30564 1726882902.17796: calling self._execute() 30564 1726882902.17874: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.17880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.17890: variable 'omit' from source: magic vars 30564 1726882902.18162: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.18181: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.18184: _execute() done 30564 1726882902.18189: dumping result to json 30564 1726882902.18191: done dumping result, returning 30564 1726882902.18194: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-4216-acec-0000000020af] 30564 1726882902.18200: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020af 30564 1726882902.18291: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020af 30564 1726882902.18294: WORKER PROCESS EXITING 30564 1726882902.18324: no more pending results, returning what we have 30564 1726882902.18331: in VariableManager get_vars() 30564 1726882902.18378: Calling all_inventory to load vars for managed_node2 30564 1726882902.18381: Calling groups_inventory to load vars for 
managed_node2 30564 1726882902.18385: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.18398: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.18400: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.18407: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.19333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.20272: done with get_vars() 30564 1726882902.20287: variable 'ansible_search_path' from source: unknown 30564 1726882902.20288: variable 'ansible_search_path' from source: unknown 30564 1726882902.20316: we have included files to process 30564 1726882902.20316: generating all_blocks data 30564 1726882902.20318: done generating all_blocks data 30564 1726882902.20321: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882902.20321: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882902.20323: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 30564 1726882902.20396: in VariableManager get_vars() 30564 1726882902.20409: done with get_vars() 30564 1726882902.20484: done processing included file 30564 1726882902.20486: iterating over new_blocks loaded from include file 30564 1726882902.20487: in VariableManager get_vars() 30564 1726882902.20497: done with get_vars() 30564 1726882902.20498: filtering new block on tags 30564 1726882902.20518: done filtering new block on tags 30564 1726882902.20519: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 30564 
1726882902.20523: extending task lists for all hosts with included blocks 30564 1726882902.20790: done extending task lists 30564 1726882902.20792: done processing included files 30564 1726882902.20793: results queue empty 30564 1726882902.20793: checking for any_errors_fatal 30564 1726882902.20798: done checking for any_errors_fatal 30564 1726882902.20799: checking for max_fail_percentage 30564 1726882902.20799: done checking for max_fail_percentage 30564 1726882902.20800: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.20800: done checking to see if all hosts have failed 30564 1726882902.20801: getting the remaining hosts for this loop 30564 1726882902.20802: done getting the remaining hosts for this loop 30564 1726882902.20804: getting the next task for host managed_node2 30564 1726882902.20807: done getting next task for host managed_node2 30564 1726882902.20808: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 30564 1726882902.20810: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882902.20812: getting variables 30564 1726882902.20812: in VariableManager get_vars() 30564 1726882902.20819: Calling all_inventory to load vars for managed_node2 30564 1726882902.20821: Calling groups_inventory to load vars for managed_node2 30564 1726882902.20822: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.20825: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.20827: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.20828: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.21566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.22474: done with get_vars() 30564 1726882902.22488: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:41:42 -0400 (0:00:00.052) 0:01:40.806 ****** 30564 1726882902.22538: entering _queue_task() for managed_node2/include_tasks 30564 1726882902.22769: worker is 1 (out of 1 available) 30564 1726882902.22783: exiting _queue_task() for managed_node2/include_tasks 30564 1726882902.22796: done queuing things up, now waiting for results queue to drain 30564 1726882902.22797: waiting for pending results... 
30564 1726882902.22980: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 30564 1726882902.23055: in run() - task 0e448fcc-3ce9-4216-acec-0000000020d6 30564 1726882902.23072: variable 'ansible_search_path' from source: unknown 30564 1726882902.23075: variable 'ansible_search_path' from source: unknown 30564 1726882902.23102: calling self._execute() 30564 1726882902.23183: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.23188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.23193: variable 'omit' from source: magic vars 30564 1726882902.23468: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.23483: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.23489: _execute() done 30564 1726882902.23492: dumping result to json 30564 1726882902.23495: done dumping result, returning 30564 1726882902.23503: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-4216-acec-0000000020d6] 30564 1726882902.23511: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020d6 30564 1726882902.23598: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020d6 30564 1726882902.23601: WORKER PROCESS EXITING 30564 1726882902.23631: no more pending results, returning what we have 30564 1726882902.23635: in VariableManager get_vars() 30564 1726882902.23682: Calling all_inventory to load vars for managed_node2 30564 1726882902.23685: Calling groups_inventory to load vars for managed_node2 30564 1726882902.23688: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.23700: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.23703: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.23706: Calling groups_plugins_play to load vars for managed_node2 30564 
1726882902.24511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.25447: done with get_vars() 30564 1726882902.25462: variable 'ansible_search_path' from source: unknown 30564 1726882902.25465: variable 'ansible_search_path' from source: unknown 30564 1726882902.25489: we have included files to process 30564 1726882902.25490: generating all_blocks data 30564 1726882902.25491: done generating all_blocks data 30564 1726882902.25492: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882902.25493: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882902.25494: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 30564 1726882902.25658: done processing included file 30564 1726882902.25660: iterating over new_blocks loaded from include file 30564 1726882902.25661: in VariableManager get_vars() 30564 1726882902.25676: done with get_vars() 30564 1726882902.25678: filtering new block on tags 30564 1726882902.25701: done filtering new block on tags 30564 1726882902.25703: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 30564 1726882902.25706: extending task lists for all hosts with included blocks 30564 1726882902.25802: done extending task lists 30564 1726882902.25803: done processing included files 30564 1726882902.25803: results queue empty 30564 1726882902.25803: checking for any_errors_fatal 30564 1726882902.25806: done checking for any_errors_fatal 30564 1726882902.25806: checking for max_fail_percentage 30564 1726882902.25807: done 
checking for max_fail_percentage 30564 1726882902.25808: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.25808: done checking to see if all hosts have failed 30564 1726882902.25809: getting the remaining hosts for this loop 30564 1726882902.25809: done getting the remaining hosts for this loop 30564 1726882902.25811: getting the next task for host managed_node2 30564 1726882902.25814: done getting next task for host managed_node2 30564 1726882902.25816: ^ task is: TASK: Gather current interface info 30564 1726882902.25818: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882902.25819: getting variables 30564 1726882902.25820: in VariableManager get_vars() 30564 1726882902.25827: Calling all_inventory to load vars for managed_node2 30564 1726882902.25828: Calling groups_inventory to load vars for managed_node2 30564 1726882902.25829: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.25833: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.25834: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.25836: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.26554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.27462: done with get_vars() 30564 1726882902.27479: done getting variables 30564 1726882902.27505: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:41:42 -0400 (0:00:00.049) 0:01:40.856 ****** 30564 1726882902.27528: entering _queue_task() for managed_node2/command 30564 1726882902.27729: worker is 1 (out of 1 available) 30564 1726882902.27740: exiting _queue_task() for managed_node2/command 30564 1726882902.27752: done queuing things up, now waiting for results queue to drain 30564 1726882902.27753: waiting for pending results... 
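The "Gather current interface info" task whose execution follows is, per the `module_args` recorded verbatim later in this log (`chdir: /sys/class/net`, `_raw_params: ls -1`), equivalent to a task of roughly this shape; the `register` variable name is an assumption, not something the log shows:

```yaml
# Reconstructed from the module_args in the log; 'register' name is assumed
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: current_interfaces
```

Listing `/sys/class/net` is why the result below contains one line per network interface (plus `bonding_masters`).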
30564 1726882902.27940: running TaskExecutor() for managed_node2/TASK: Gather current interface info 30564 1726882902.28023: in run() - task 0e448fcc-3ce9-4216-acec-000000002111 30564 1726882902.28033: variable 'ansible_search_path' from source: unknown 30564 1726882902.28036: variable 'ansible_search_path' from source: unknown 30564 1726882902.28066: calling self._execute() 30564 1726882902.28136: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.28140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.28149: variable 'omit' from source: magic vars 30564 1726882902.28418: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.28429: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.28434: variable 'omit' from source: magic vars 30564 1726882902.28469: variable 'omit' from source: magic vars 30564 1726882902.28494: variable 'omit' from source: magic vars 30564 1726882902.28527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882902.28552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.28568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882902.28583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.28594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.28619: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.28622: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.28625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882902.28697: Set connection var ansible_timeout to 10 30564 1726882902.28700: Set connection var ansible_pipelining to False 30564 1726882902.28703: Set connection var ansible_shell_type to sh 30564 1726882902.28708: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.28719: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.28722: Set connection var ansible_connection to ssh 30564 1726882902.28737: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.28740: variable 'ansible_connection' from source: unknown 30564 1726882902.28742: variable 'ansible_module_compression' from source: unknown 30564 1726882902.28745: variable 'ansible_shell_type' from source: unknown 30564 1726882902.28747: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.28749: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.28751: variable 'ansible_pipelining' from source: unknown 30564 1726882902.28756: variable 'ansible_timeout' from source: unknown 30564 1726882902.28759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.28856: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.28867: variable 'omit' from source: magic vars 30564 1726882902.28875: starting attempt loop 30564 1726882902.28878: running the handler 30564 1726882902.28890: _low_level_execute_command(): starting 30564 1726882902.28898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882902.29425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882902.29435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882902.29460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882902.29476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882902.29490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.29535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882902.29547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882902.29670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882902.31342: stdout chunk (state=3): >>>/root <<< 30564 1726882902.31445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882902.31499: stderr chunk (state=3): >>><<< 30564 1726882902.31502: stdout chunk (state=3): >>><<< 30564 1726882902.31525: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882902.31536: _low_level_execute_command(): starting 30564 1726882902.31541: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439 `" && echo ansible-tmp-1726882902.3152328-34987-209879346107439="` echo /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439 `" ) && sleep 0' 30564 1726882902.31977: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882902.31984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882902.32017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.32039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.32090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882902.32098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882902.32215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882902.34083: stdout chunk (state=3): >>>ansible-tmp-1726882902.3152328-34987-209879346107439=/root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439 <<< 30564 1726882902.34196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882902.34236: stderr chunk (state=3): >>><<< 30564 1726882902.34239: stdout chunk (state=3): >>><<< 30564 1726882902.34254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882902.3152328-34987-209879346107439=/root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882902.34284: variable 'ansible_module_compression' from source: unknown 30564 1726882902.34326: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882902.34355: variable 'ansible_facts' from source: unknown 30564 1726882902.34420: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439/AnsiballZ_command.py 30564 1726882902.34518: Sending initial data 30564 1726882902.34522: Sent initial data (156 bytes) 30564 1726882902.35157: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882902.35163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882902.35198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882902.35211: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882902.35223: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.35272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882902.35282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882902.35390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882902.37127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 30564 1726882902.37134: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882902.37230: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882902.37331: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp_3ghcvgp /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439/AnsiballZ_command.py <<< 30564 1726882902.37422: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882902.38439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882902.38548: stderr chunk (state=3): >>><<< 30564 1726882902.38551: stdout chunk (state=3): >>><<< 
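The `_low_level_execute_command()` calls around this point follow Ansible's standard AnsiballZ flow: discover the remote home (`echo ~`), create a private per-task tmpdir, sftp the module payload into it, `chmod u+x` it, run it with the remote Python so it prints one JSON result document on stdout, then remove the tmpdir. A minimal local re-enactment of that sequence (the payload below is a stand-in; the real `AnsiballZ_command.py` is a self-extracting zip wrapper, not shown in this log):

```python
import json
import os
import shutil
import subprocess
import sys
import tempfile

# 1) per-task private tmpdir (mirrors `umask 77 && mkdir -p ...` in the log)
tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
module = os.path.join(tmpdir, "AnsiballZ_command.py")

# 2) stand-in for the sftp'd AnsiballZ payload: emits one JSON result document
with open(module, "w") as f:
    f.write('import json; print(json.dumps({"rc": 0, "stdout": "lo"}))\n')

# 3) chmod u+x, then 4) execute with the interpreter and parse the JSON result
os.chmod(module, 0o700)
proc = subprocess.run([sys.executable, module], capture_output=True, text=True)
result = json.loads(proc.stdout)

# 5) cleanup (mirrors `rm -f -r <tmpdir>/ > /dev/null 2>&1` in the log)
shutil.rmtree(tmpdir)
```

The controller then merges this JSON into the task result, which is what the `ok: [managed_node2] => {...}` block further down renders.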
30564 1726882902.38573: done transferring module to remote 30564 1726882902.38581: _low_level_execute_command(): starting 30564 1726882902.38585: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439/ /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439/AnsiballZ_command.py && sleep 0' 30564 1726882902.39041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882902.39048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882902.39093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.39100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882902.39110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.39171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882902.39175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882902.39291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882902.41075: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 30564 1726882902.41147: stderr chunk (state=3): >>><<< 30564 1726882902.41150: stdout chunk (state=3): >>><<< 30564 1726882902.41254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882902.41258: _low_level_execute_command(): starting 30564 1726882902.41260: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439/AnsiballZ_command.py && sleep 0' 30564 1726882902.41871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882902.41889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882902.41910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882902.41932: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882902.41978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882902.41991: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882902.42006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.42035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882902.42048: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882902.42060: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882902.42078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882902.42094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882902.42111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882902.42132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882902.42149: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882902.42165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.42250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882902.42276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882902.42294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882902.42432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882902.55776: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 
21:41:42.552423", "end": "2024-09-20 21:41:42.555779", "delta": "0:00:00.003356", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882902.56947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882902.57007: stderr chunk (state=3): >>><<< 30564 1726882902.57010: stdout chunk (state=3): >>><<< 30564 1726882902.57030: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:41:42.552423", "end": "2024-09-20 21:41:42.555779", "delta": "0:00:00.003356", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882902.57058: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882902.57066: _low_level_execute_command(): starting 30564 1726882902.57073: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882902.3152328-34987-209879346107439/ > /dev/null 2>&1 && sleep 0' 30564 1726882902.57534: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882902.57538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882902.57572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882902.57576: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882902.57592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882902.57624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882902.57635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882902.57739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882902.59533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882902.59583: stderr chunk (state=3): >>><<< 30564 1726882902.59586: stdout chunk (state=3): >>><<< 30564 1726882902.59601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882902.59608: handler run complete 30564 1726882902.59624: Evaluated conditional (False): False 30564 1726882902.59634: attempt loop complete, returning result 30564 1726882902.59636: _execute() done 30564 1726882902.59639: dumping result to json 30564 1726882902.59644: done dumping result, returning 30564 1726882902.59650: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-4216-acec-000000002111] 30564 1726882902.59656: sending task result for task 0e448fcc-3ce9-4216-acec-000000002111 30564 1726882902.59761: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002111 30564 1726882902.59766: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003356", "end": "2024-09-20 21:41:42.555779", "rc": 0, "start": "2024-09-20 21:41:42.552423" } STDOUT: bonding_masters eth0 lo rpltstbr 30564 1726882902.59836: no more pending results, returning what we have 30564 1726882902.59840: results queue empty 30564 1726882902.59841: checking for any_errors_fatal 30564 1726882902.59843: done checking for any_errors_fatal 30564 1726882902.59844: checking for max_fail_percentage 30564 1726882902.59846: done checking for max_fail_percentage 30564 1726882902.59847: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.59848: done checking to see if all hosts have failed 30564 1726882902.59848: getting the remaining hosts for this loop 30564 1726882902.59850: done getting the remaining hosts for 
this loop 30564 1726882902.59854: getting the next task for host managed_node2 30564 1726882902.59862: done getting next task for host managed_node2 30564 1726882902.59866: ^ task is: TASK: Set current_interfaces 30564 1726882902.59873: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882902.59879: getting variables 30564 1726882902.59880: in VariableManager get_vars() 30564 1726882902.59920: Calling all_inventory to load vars for managed_node2 30564 1726882902.59922: Calling groups_inventory to load vars for managed_node2 30564 1726882902.59926: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.59936: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.59939: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.59942: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.60933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.65631: done with get_vars() 30564 1726882902.65651: done getting variables 30564 1726882902.65689: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:41:42 -0400 (0:00:00.381) 0:01:41.238 ****** 30564 1726882902.65710: entering _queue_task() for managed_node2/set_fact 30564 1726882902.65948: worker is 1 (out of 1 available) 30564 1726882902.65961: exiting _queue_task() for managed_node2/set_fact 30564 1726882902.65977: done queuing things up, now waiting for results queue to drain 30564 1726882902.65979: waiting for pending results... 
30564 1726882902.66160: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 30564 1726882902.66257: in run() - task 0e448fcc-3ce9-4216-acec-000000002112 30564 1726882902.66274: variable 'ansible_search_path' from source: unknown 30564 1726882902.66279: variable 'ansible_search_path' from source: unknown 30564 1726882902.66307: calling self._execute() 30564 1726882902.66385: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.66389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.66397: variable 'omit' from source: magic vars 30564 1726882902.66689: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.66703: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.66709: variable 'omit' from source: magic vars 30564 1726882902.66745: variable 'omit' from source: magic vars 30564 1726882902.66908: variable '_current_interfaces' from source: set_fact 30564 1726882902.67002: variable 'omit' from source: magic vars 30564 1726882902.67048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882902.67093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.67119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882902.67140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.67162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.67202: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.67210: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.67217: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.67327: Set connection var ansible_timeout to 10 30564 1726882902.67338: Set connection var ansible_pipelining to False 30564 1726882902.67344: Set connection var ansible_shell_type to sh 30564 1726882902.67353: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.67371: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.67379: Set connection var ansible_connection to ssh 30564 1726882902.67410: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.67418: variable 'ansible_connection' from source: unknown 30564 1726882902.67425: variable 'ansible_module_compression' from source: unknown 30564 1726882902.67430: variable 'ansible_shell_type' from source: unknown 30564 1726882902.67436: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.67442: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.67448: variable 'ansible_pipelining' from source: unknown 30564 1726882902.67454: variable 'ansible_timeout' from source: unknown 30564 1726882902.67461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.67631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.67651: variable 'omit' from source: magic vars 30564 1726882902.67662: starting attempt loop 30564 1726882902.67672: running the handler 30564 1726882902.67694: handler run complete 30564 1726882902.67709: attempt loop complete, returning result 30564 1726882902.67717: _execute() done 30564 1726882902.67726: dumping result to json 30564 1726882902.67735: done dumping result, returning 30564 
1726882902.67746: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-4216-acec-000000002112] 30564 1726882902.67756: sending task result for task 0e448fcc-3ce9-4216-acec-000000002112 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 30564 1726882902.67919: no more pending results, returning what we have 30564 1726882902.67922: results queue empty 30564 1726882902.67923: checking for any_errors_fatal 30564 1726882902.67935: done checking for any_errors_fatal 30564 1726882902.67936: checking for max_fail_percentage 30564 1726882902.67938: done checking for max_fail_percentage 30564 1726882902.67939: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.67940: done checking to see if all hosts have failed 30564 1726882902.67941: getting the remaining hosts for this loop 30564 1726882902.67943: done getting the remaining hosts for this loop 30564 1726882902.67947: getting the next task for host managed_node2 30564 1726882902.67958: done getting next task for host managed_node2 30564 1726882902.67961: ^ task is: TASK: Show current_interfaces 30564 1726882902.67967: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882902.67972: getting variables 30564 1726882902.67974: in VariableManager get_vars() 30564 1726882902.68014: Calling all_inventory to load vars for managed_node2 30564 1726882902.68017: Calling groups_inventory to load vars for managed_node2 30564 1726882902.68021: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.68033: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.68036: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.68039: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.69087: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002112 30564 1726882902.69090: WORKER PROCESS EXITING 30564 1726882902.69360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.70317: done with get_vars() 30564 1726882902.70337: done getting variables 30564 1726882902.70382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:41:42 -0400 (0:00:00.046) 0:01:41.285 ****** 30564 1726882902.70405: entering _queue_task() for managed_node2/debug 30564 1726882902.70725: worker is 1 (out of 1 available) 30564 1726882902.70738: exiting _queue_task() for managed_node2/debug 30564 1726882902.70751: done queuing things up, now waiting for results queue to drain 30564 1726882902.70752: waiting for pending 
results... 30564 1726882902.71062: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 30564 1726882902.71185: in run() - task 0e448fcc-3ce9-4216-acec-0000000020d7 30564 1726882902.71210: variable 'ansible_search_path' from source: unknown 30564 1726882902.71217: variable 'ansible_search_path' from source: unknown 30564 1726882902.71254: calling self._execute() 30564 1726882902.71368: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.71381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.71396: variable 'omit' from source: magic vars 30564 1726882902.71799: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.71829: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.71839: variable 'omit' from source: magic vars 30564 1726882902.71917: variable 'omit' from source: magic vars 30564 1726882902.72028: variable 'current_interfaces' from source: set_fact 30564 1726882902.72066: variable 'omit' from source: magic vars 30564 1726882902.72111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882902.72152: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882902.72181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882902.72203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.72219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882902.72253: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882902.72261: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.72273: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.72618: Set connection var ansible_timeout to 10 30564 1726882902.72629: Set connection var ansible_pipelining to False 30564 1726882902.72635: Set connection var ansible_shell_type to sh 30564 1726882902.72644: Set connection var ansible_shell_executable to /bin/sh 30564 1726882902.72655: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882902.72661: Set connection var ansible_connection to ssh 30564 1726882902.72694: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.72705: variable 'ansible_connection' from source: unknown 30564 1726882902.72712: variable 'ansible_module_compression' from source: unknown 30564 1726882902.72718: variable 'ansible_shell_type' from source: unknown 30564 1726882902.72724: variable 'ansible_shell_executable' from source: unknown 30564 1726882902.72731: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.72738: variable 'ansible_pipelining' from source: unknown 30564 1726882902.72744: variable 'ansible_timeout' from source: unknown 30564 1726882902.72751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.72894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882902.72915: variable 'omit' from source: magic vars 30564 1726882902.72925: starting attempt loop 30564 1726882902.72931: running the handler 30564 1726882902.72981: handler run complete 30564 1726882902.72999: attempt loop complete, returning result 30564 1726882902.73005: _execute() done 30564 1726882902.73011: dumping result to json 30564 1726882902.73017: done dumping result, returning 30564 1726882902.73031: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-4216-acec-0000000020d7] 30564 1726882902.73042: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020d7 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 30564 1726882902.73182: no more pending results, returning what we have 30564 1726882902.73186: results queue empty 30564 1726882902.73187: checking for any_errors_fatal 30564 1726882902.73195: done checking for any_errors_fatal 30564 1726882902.73195: checking for max_fail_percentage 30564 1726882902.73198: done checking for max_fail_percentage 30564 1726882902.73199: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.73200: done checking to see if all hosts have failed 30564 1726882902.73201: getting the remaining hosts for this loop 30564 1726882902.73203: done getting the remaining hosts for this loop 30564 1726882902.73207: getting the next task for host managed_node2 30564 1726882902.73218: done getting next task for host managed_node2 30564 1726882902.73222: ^ task is: TASK: Setup 30564 1726882902.73225: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882902.73231: getting variables 30564 1726882902.73232: in VariableManager get_vars() 30564 1726882902.73274: Calling all_inventory to load vars for managed_node2 30564 1726882902.73277: Calling groups_inventory to load vars for managed_node2 30564 1726882902.73281: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.73292: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.73296: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.73298: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.74286: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020d7 30564 1726882902.74290: WORKER PROCESS EXITING 30564 1726882902.75186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.76877: done with get_vars() 30564 1726882902.76900: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:41:42 -0400 (0:00:00.065) 0:01:41.351 ****** 30564 1726882902.76993: entering _queue_task() for managed_node2/include_tasks 30564 1726882902.77335: worker is 1 (out of 1 available) 30564 1726882902.77349: exiting _queue_task() for managed_node2/include_tasks 30564 1726882902.77360: done queuing things up, now waiting for results queue to drain 30564 1726882902.77362: waiting for pending results... 
30564 1726882902.77682: running TaskExecutor() for managed_node2/TASK: Setup 30564 1726882902.77803: in run() - task 0e448fcc-3ce9-4216-acec-0000000020b0 30564 1726882902.77825: variable 'ansible_search_path' from source: unknown 30564 1726882902.77833: variable 'ansible_search_path' from source: unknown 30564 1726882902.77903: variable 'lsr_setup' from source: include params 30564 1726882902.78174: variable 'lsr_setup' from source: include params 30564 1726882902.78269: variable 'omit' from source: magic vars 30564 1726882902.78474: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.78490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.78508: variable 'omit' from source: magic vars 30564 1726882902.78803: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.78819: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.78830: variable 'item' from source: unknown 30564 1726882902.78905: variable 'item' from source: unknown 30564 1726882902.78942: variable 'item' from source: unknown 30564 1726882902.79015: variable 'item' from source: unknown 30564 1726882902.79210: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.79224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.79236: variable 'omit' from source: magic vars 30564 1726882902.79393: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.79404: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.79412: variable 'item' from source: unknown 30564 1726882902.79498: variable 'item' from source: unknown 30564 1726882902.79539: variable 'item' from source: unknown 30564 1726882902.79671: variable 'item' from source: unknown 30564 1726882902.79830: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882902.79837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.79844: variable 'omit' from source: magic vars 30564 1726882902.79971: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.79978: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.79983: variable 'item' from source: unknown 30564 1726882902.80029: variable 'item' from source: unknown 30564 1726882902.80051: variable 'item' from source: unknown 30564 1726882902.80102: variable 'item' from source: unknown 30564 1726882902.80159: dumping result to json 30564 1726882902.80162: done dumping result, returning 30564 1726882902.80166: done running TaskExecutor() for managed_node2/TASK: Setup [0e448fcc-3ce9-4216-acec-0000000020b0] 30564 1726882902.80169: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b0 30564 1726882902.80203: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b0 30564 1726882902.80206: WORKER PROCESS EXITING 30564 1726882902.80229: no more pending results, returning what we have 30564 1726882902.80234: in VariableManager get_vars() 30564 1726882902.80279: Calling all_inventory to load vars for managed_node2 30564 1726882902.80282: Calling groups_inventory to load vars for managed_node2 30564 1726882902.80285: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.80298: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.80302: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.80304: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.81142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.82839: done with get_vars() 30564 1726882902.82853: variable 'ansible_search_path' from source: unknown 30564 1726882902.82854: variable 'ansible_search_path' from source: unknown 30564 
1726882902.82885: variable 'ansible_search_path' from source: unknown 30564 1726882902.82886: variable 'ansible_search_path' from source: unknown 30564 1726882902.82904: variable 'ansible_search_path' from source: unknown 30564 1726882902.82905: variable 'ansible_search_path' from source: unknown 30564 1726882902.82923: we have included files to process 30564 1726882902.82924: generating all_blocks data 30564 1726882902.82925: done generating all_blocks data 30564 1726882902.82928: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882902.82929: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882902.82930: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml 30564 1726882902.83097: done processing included file 30564 1726882902.83099: iterating over new_blocks loaded from include file 30564 1726882902.83100: in VariableManager get_vars() 30564 1726882902.83110: done with get_vars() 30564 1726882902.83112: filtering new block on tags 30564 1726882902.83135: done filtering new block on tags 30564 1726882902.83136: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml for managed_node2 => (item=tasks/create_bridge_profile.yml) 30564 1726882902.83139: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882902.83140: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882902.83142: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml 30564 1726882902.83205: done processing included file 30564 1726882902.83206: iterating over new_blocks loaded from include file 30564 1726882902.83207: in VariableManager get_vars() 30564 1726882902.83218: done with get_vars() 30564 1726882902.83219: filtering new block on tags 30564 1726882902.83232: done filtering new block on tags 30564 1726882902.83233: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml for managed_node2 => (item=tasks/activate_profile.yml) 30564 1726882902.83236: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882902.83237: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882902.83240: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882902.83301: done processing included file 30564 1726882902.83303: iterating over new_blocks loaded from include file 30564 1726882902.83303: in VariableManager get_vars() 30564 1726882902.83314: done with get_vars() 30564 1726882902.83314: filtering new block on tags 30564 1726882902.83327: done filtering new block on tags 30564 1726882902.83329: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 30564 1726882902.83331: extending task lists for all hosts with included blocks 30564 1726882902.83743: done extending task lists 30564 1726882902.83744: done processing 
included files 30564 1726882902.83744: results queue empty 30564 1726882902.83745: checking for any_errors_fatal 30564 1726882902.83748: done checking for any_errors_fatal 30564 1726882902.83748: checking for max_fail_percentage 30564 1726882902.83749: done checking for max_fail_percentage 30564 1726882902.83749: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.83750: done checking to see if all hosts have failed 30564 1726882902.83750: getting the remaining hosts for this loop 30564 1726882902.83751: done getting the remaining hosts for this loop 30564 1726882902.83753: getting the next task for host managed_node2 30564 1726882902.83756: done getting next task for host managed_node2 30564 1726882902.83757: ^ task is: TASK: Include network role 30564 1726882902.83759: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882902.83761: getting variables 30564 1726882902.83761: in VariableManager get_vars() 30564 1726882902.83772: Calling all_inventory to load vars for managed_node2 30564 1726882902.83773: Calling groups_inventory to load vars for managed_node2 30564 1726882902.83775: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.83780: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.83781: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.83783: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.84444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.86057: done with get_vars() 30564 1726882902.86080: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:3 Friday 20 September 2024 21:41:42 -0400 (0:00:00.091) 0:01:41.442 ****** 30564 1726882902.86155: entering _queue_task() for managed_node2/include_role 30564 1726882902.86612: worker is 1 (out of 1 available) 30564 1726882902.86624: exiting _queue_task() for managed_node2/include_role 30564 1726882902.86640: done queuing things up, now waiting for results queue to drain 30564 1726882902.86642: waiting for pending results... 
30564 1726882902.86850: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882902.86935: in run() - task 0e448fcc-3ce9-4216-acec-000000002139 30564 1726882902.86945: variable 'ansible_search_path' from source: unknown 30564 1726882902.86949: variable 'ansible_search_path' from source: unknown 30564 1726882902.86978: calling self._execute() 30564 1726882902.87053: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.87057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.87071: variable 'omit' from source: magic vars 30564 1726882902.87352: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.87362: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.87375: _execute() done 30564 1726882902.87378: dumping result to json 30564 1726882902.87380: done dumping result, returning 30564 1726882902.87387: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-000000002139] 30564 1726882902.87392: sending task result for task 0e448fcc-3ce9-4216-acec-000000002139 30564 1726882902.87509: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002139 30564 1726882902.87512: WORKER PROCESS EXITING 30564 1726882902.87543: no more pending results, returning what we have 30564 1726882902.87548: in VariableManager get_vars() 30564 1726882902.87595: Calling all_inventory to load vars for managed_node2 30564 1726882902.87598: Calling groups_inventory to load vars for managed_node2 30564 1726882902.87601: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.87613: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.87616: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.87619: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.88611: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.90357: done with get_vars() 30564 1726882902.90379: variable 'ansible_search_path' from source: unknown 30564 1726882902.90381: variable 'ansible_search_path' from source: unknown 30564 1726882902.90583: variable 'omit' from source: magic vars 30564 1726882902.90622: variable 'omit' from source: magic vars 30564 1726882902.90635: variable 'omit' from source: magic vars 30564 1726882902.90639: we have included files to process 30564 1726882902.90639: generating all_blocks data 30564 1726882902.90640: done generating all_blocks data 30564 1726882902.90641: processing included file: fedora.linux_system_roles.network 30564 1726882902.90654: in VariableManager get_vars() 30564 1726882902.90665: done with get_vars() 30564 1726882902.90686: in VariableManager get_vars() 30564 1726882902.90698: done with get_vars() 30564 1726882902.90725: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882902.90842: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882902.90920: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882902.91241: in VariableManager get_vars() 30564 1726882902.91269: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882902.93062: iterating over new_blocks loaded from include file 30564 1726882902.93066: in VariableManager get_vars() 30564 1726882902.93081: done with get_vars() 30564 1726882902.93082: filtering new block on tags 30564 1726882902.93250: done filtering new block on tags 30564 1726882902.93252: in VariableManager get_vars() 30564 1726882902.93262: done with get_vars() 30564 1726882902.93265: filtering new block on tags 30564 1726882902.93277: done 
filtering new block on tags 30564 1726882902.93279: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882902.93282: extending task lists for all hosts with included blocks 30564 1726882902.93380: done extending task lists 30564 1726882902.93381: done processing included files 30564 1726882902.93381: results queue empty 30564 1726882902.93382: checking for any_errors_fatal 30564 1726882902.93385: done checking for any_errors_fatal 30564 1726882902.93385: checking for max_fail_percentage 30564 1726882902.93386: done checking for max_fail_percentage 30564 1726882902.93386: checking to see if all hosts have failed and the running result is not ok 30564 1726882902.93387: done checking to see if all hosts have failed 30564 1726882902.93387: getting the remaining hosts for this loop 30564 1726882902.93388: done getting the remaining hosts for this loop 30564 1726882902.93390: getting the next task for host managed_node2 30564 1726882902.93393: done getting next task for host managed_node2 30564 1726882902.93395: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882902.93397: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882902.93404: getting variables 30564 1726882902.93405: in VariableManager get_vars() 30564 1726882902.93414: Calling all_inventory to load vars for managed_node2 30564 1726882902.93416: Calling groups_inventory to load vars for managed_node2 30564 1726882902.93417: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.93421: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.93423: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.93425: Calling groups_plugins_play to load vars for managed_node2 30564 1726882902.94224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.95688: done with get_vars() 30564 1726882902.95705: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:42 -0400 (0:00:00.095) 0:01:41.538 ****** 30564 1726882902.95754: entering _queue_task() for managed_node2/include_tasks 30564 1726882902.95986: worker is 1 (out of 1 available) 30564 1726882902.95999: exiting _queue_task() for managed_node2/include_tasks 30564 1726882902.96011: done queuing things up, now waiting for results queue to drain 30564 1726882902.96012: waiting for pending results... 
30564 1726882902.96199: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882902.96290: in run() - task 0e448fcc-3ce9-4216-acec-0000000021a3 30564 1726882902.96301: variable 'ansible_search_path' from source: unknown 30564 1726882902.96304: variable 'ansible_search_path' from source: unknown 30564 1726882902.96333: calling self._execute() 30564 1726882902.96410: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882902.96414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882902.96424: variable 'omit' from source: magic vars 30564 1726882902.96707: variable 'ansible_distribution_major_version' from source: facts 30564 1726882902.96718: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882902.96724: _execute() done 30564 1726882902.96727: dumping result to json 30564 1726882902.96729: done dumping result, returning 30564 1726882902.96736: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-0000000021a3] 30564 1726882902.96741: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a3 30564 1726882902.96830: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a3 30564 1726882902.96832: WORKER PROCESS EXITING 30564 1726882902.96881: no more pending results, returning what we have 30564 1726882902.96886: in VariableManager get_vars() 30564 1726882902.96933: Calling all_inventory to load vars for managed_node2 30564 1726882902.96936: Calling groups_inventory to load vars for managed_node2 30564 1726882902.96938: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882902.96955: Calling all_plugins_play to load vars for managed_node2 30564 1726882902.96958: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882902.96961: Calling 
groups_plugins_play to load vars for managed_node2 30564 1726882902.97762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882902.98714: done with get_vars() 30564 1726882902.98728: variable 'ansible_search_path' from source: unknown 30564 1726882902.98729: variable 'ansible_search_path' from source: unknown 30564 1726882902.98753: we have included files to process 30564 1726882902.98754: generating all_blocks data 30564 1726882902.98755: done generating all_blocks data 30564 1726882902.98758: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882902.98758: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882902.98759: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882902.99130: done processing included file 30564 1726882902.99132: iterating over new_blocks loaded from include file 30564 1726882902.99133: in VariableManager get_vars() 30564 1726882902.99148: done with get_vars() 30564 1726882902.99149: filtering new block on tags 30564 1726882902.99169: done filtering new block on tags 30564 1726882902.99171: in VariableManager get_vars() 30564 1726882902.99186: done with get_vars() 30564 1726882902.99187: filtering new block on tags 30564 1726882902.99213: done filtering new block on tags 30564 1726882902.99214: in VariableManager get_vars() 30564 1726882902.99230: done with get_vars() 30564 1726882902.99232: filtering new block on tags 30564 1726882902.99256: done filtering new block on tags 30564 1726882902.99257: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882902.99261: extending task lists for 
all hosts with included blocks 30564 1726882903.00344: done extending task lists 30564 1726882903.00345: done processing included files 30564 1726882903.00345: results queue empty 30564 1726882903.00346: checking for any_errors_fatal 30564 1726882903.00348: done checking for any_errors_fatal 30564 1726882903.00349: checking for max_fail_percentage 30564 1726882903.00349: done checking for max_fail_percentage 30564 1726882903.00350: checking to see if all hosts have failed and the running result is not ok 30564 1726882903.00350: done checking to see if all hosts have failed 30564 1726882903.00351: getting the remaining hosts for this loop 30564 1726882903.00352: done getting the remaining hosts for this loop 30564 1726882903.00354: getting the next task for host managed_node2 30564 1726882903.00357: done getting next task for host managed_node2 30564 1726882903.00358: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882903.00362: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882903.00371: getting variables 30564 1726882903.00372: in VariableManager get_vars() 30564 1726882903.00381: Calling all_inventory to load vars for managed_node2 30564 1726882903.00383: Calling groups_inventory to load vars for managed_node2 30564 1726882903.00384: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882903.00388: Calling all_plugins_play to load vars for managed_node2 30564 1726882903.00389: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882903.00391: Calling groups_plugins_play to load vars for managed_node2 30564 1726882903.01057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882903.01982: done with get_vars() 30564 1726882903.01997: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:43 -0400 (0:00:00.062) 0:01:41.601 ****** 30564 1726882903.02048: entering _queue_task() for managed_node2/setup 30564 1726882903.02296: worker is 1 (out of 1 available) 30564 1726882903.02311: exiting _queue_task() for managed_node2/setup 30564 1726882903.02322: done queuing things up, now waiting for results queue to drain 30564 1726882903.02324: waiting for pending results... 
30564 1726882903.02520: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882903.02625: in run() - task 0e448fcc-3ce9-4216-acec-000000002200 30564 1726882903.02636: variable 'ansible_search_path' from source: unknown 30564 1726882903.02640: variable 'ansible_search_path' from source: unknown 30564 1726882903.02669: calling self._execute() 30564 1726882903.02745: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882903.02748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882903.02758: variable 'omit' from source: magic vars 30564 1726882903.03044: variable 'ansible_distribution_major_version' from source: facts 30564 1726882903.03055: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882903.03216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882903.04809: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882903.04853: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882903.04884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882903.04911: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882903.04931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882903.04995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882903.05014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882903.05032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882903.05059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882903.05073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882903.05114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882903.05129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882903.05148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882903.05176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882903.05193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882903.05302: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882903.05305: variable 'ansible_facts' from source: unknown 30564 1726882903.05908: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882903.05912: when evaluation is False, skipping this task 30564 1726882903.05915: _execute() done 30564 1726882903.05918: dumping result to json 30564 1726882903.05920: done dumping result, returning 30564 1726882903.05926: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-000000002200] 30564 1726882903.05932: sending task result for task 0e448fcc-3ce9-4216-acec-000000002200 30564 1726882903.06020: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002200 30564 1726882903.06023: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882903.06075: no more pending results, returning what we have 30564 1726882903.06079: results queue empty 30564 1726882903.06080: checking for any_errors_fatal 30564 1726882903.06082: done checking for any_errors_fatal 30564 1726882903.06083: checking for max_fail_percentage 30564 1726882903.06085: done checking for max_fail_percentage 30564 1726882903.06086: checking to see if all hosts have failed and the running result is not ok 30564 1726882903.06086: done checking to see if all hosts have failed 30564 1726882903.06087: getting the remaining hosts for this loop 30564 1726882903.06089: done getting the remaining hosts for this loop 30564 1726882903.06092: getting the next task for host managed_node2 30564 1726882903.06104: done getting next task for host managed_node2 30564 1726882903.06108: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882903.06114: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882903.06134: getting variables 30564 1726882903.06136: in VariableManager get_vars() 30564 1726882903.06190: Calling all_inventory to load vars for managed_node2 30564 1726882903.06193: Calling groups_inventory to load vars for managed_node2 30564 1726882903.06195: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882903.06206: Calling all_plugins_play to load vars for managed_node2 30564 1726882903.06208: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882903.06217: Calling groups_plugins_play to load vars for managed_node2 30564 1726882903.07201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882903.08187: done with get_vars() 30564 1726882903.08205: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:43 -0400 (0:00:00.062) 0:01:41.664 ****** 30564 1726882903.08282: entering _queue_task() for managed_node2/stat 30564 1726882903.08528: worker is 1 (out of 1 available) 30564 1726882903.08544: exiting _queue_task() for managed_node2/stat 30564 1726882903.08557: done queuing things up, now waiting for results queue to drain 30564 1726882903.08558: waiting for pending results... 
30564 1726882903.08765: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882903.08877: in run() - task 0e448fcc-3ce9-4216-acec-000000002202 30564 1726882903.08890: variable 'ansible_search_path' from source: unknown 30564 1726882903.08894: variable 'ansible_search_path' from source: unknown 30564 1726882903.08922: calling self._execute() 30564 1726882903.09002: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882903.09008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882903.09017: variable 'omit' from source: magic vars 30564 1726882903.09345: variable 'ansible_distribution_major_version' from source: facts 30564 1726882903.09364: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882903.09503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882903.09721: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882903.09753: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882903.09784: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882903.09808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882903.09873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882903.09894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882903.09912: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882903.09930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882903.09996: variable '__network_is_ostree' from source: set_fact 30564 1726882903.10002: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882903.10005: when evaluation is False, skipping this task 30564 1726882903.10008: _execute() done 30564 1726882903.10011: dumping result to json 30564 1726882903.10013: done dumping result, returning 30564 1726882903.10020: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000002202] 30564 1726882903.10025: sending task result for task 0e448fcc-3ce9-4216-acec-000000002202 30564 1726882903.10110: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002202 30564 1726882903.10113: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882903.10163: no more pending results, returning what we have 30564 1726882903.10168: results queue empty 30564 1726882903.10170: checking for any_errors_fatal 30564 1726882903.10179: done checking for any_errors_fatal 30564 1726882903.10180: checking for max_fail_percentage 30564 1726882903.10182: done checking for max_fail_percentage 30564 1726882903.10183: checking to see if all hosts have failed and the running result is not ok 30564 1726882903.10184: done checking to see if all hosts have failed 30564 1726882903.10185: getting the remaining hosts for this loop 30564 1726882903.10187: done getting the remaining hosts for this loop 30564 
1726882903.10195: getting the next task for host managed_node2 30564 1726882903.10202: done getting next task for host managed_node2 30564 1726882903.10206: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882903.10212: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882903.10237: getting variables 30564 1726882903.10238: in VariableManager get_vars() 30564 1726882903.10279: Calling all_inventory to load vars for managed_node2 30564 1726882903.10282: Calling groups_inventory to load vars for managed_node2 30564 1726882903.10284: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882903.10293: Calling all_plugins_play to load vars for managed_node2 30564 1726882903.10295: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882903.10297: Calling groups_plugins_play to load vars for managed_node2 30564 1726882903.11119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882903.12179: done with get_vars() 30564 1726882903.12195: done getting variables 30564 1726882903.12235: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:43 -0400 (0:00:00.039) 0:01:41.703 ****** 30564 1726882903.12260: entering _queue_task() for managed_node2/set_fact 30564 1726882903.12469: worker is 1 (out of 1 available) 30564 1726882903.12482: exiting _queue_task() for managed_node2/set_fact 30564 1726882903.12494: done queuing things up, now waiting for results queue to drain 30564 1726882903.12495: waiting for pending results... 
30564 1726882903.12676: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882903.12784: in run() - task 0e448fcc-3ce9-4216-acec-000000002203 30564 1726882903.12795: variable 'ansible_search_path' from source: unknown 30564 1726882903.12799: variable 'ansible_search_path' from source: unknown 30564 1726882903.12826: calling self._execute() 30564 1726882903.12903: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882903.12907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882903.12917: variable 'omit' from source: magic vars 30564 1726882903.13195: variable 'ansible_distribution_major_version' from source: facts 30564 1726882903.13206: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882903.13319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882903.13510: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882903.13541: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882903.13567: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882903.13595: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882903.13657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882903.13679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882903.13697: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882903.13718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882903.13781: variable '__network_is_ostree' from source: set_fact 30564 1726882903.13787: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882903.13790: when evaluation is False, skipping this task 30564 1726882903.13792: _execute() done 30564 1726882903.13795: dumping result to json 30564 1726882903.13798: done dumping result, returning 30564 1726882903.13805: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000002203] 30564 1726882903.13810: sending task result for task 0e448fcc-3ce9-4216-acec-000000002203 30564 1726882903.13892: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002203 30564 1726882903.13895: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882903.13943: no more pending results, returning what we have 30564 1726882903.13947: results queue empty 30564 1726882903.13948: checking for any_errors_fatal 30564 1726882903.13953: done checking for any_errors_fatal 30564 1726882903.13954: checking for max_fail_percentage 30564 1726882903.13955: done checking for max_fail_percentage 30564 1726882903.13956: checking to see if all hosts have failed and the running result is not ok 30564 1726882903.13957: done checking to see if all hosts have failed 30564 1726882903.13958: getting the remaining hosts for this loop 30564 1726882903.13959: done getting the remaining hosts for this loop 
30564 1726882903.13963: getting the next task for host managed_node2 30564 1726882903.13975: done getting next task for host managed_node2 30564 1726882903.13978: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882903.13984: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882903.14002: getting variables 30564 1726882903.14003: in VariableManager get_vars() 30564 1726882903.14045: Calling all_inventory to load vars for managed_node2 30564 1726882903.14047: Calling groups_inventory to load vars for managed_node2 30564 1726882903.14049: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882903.14056: Calling all_plugins_play to load vars for managed_node2 30564 1726882903.14058: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882903.14059: Calling groups_plugins_play to load vars for managed_node2 30564 1726882903.14835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882903.15787: done with get_vars() 30564 1726882903.15802: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:43 -0400 (0:00:00.036) 0:01:41.739 ****** 30564 1726882903.15865: entering _queue_task() for managed_node2/service_facts 30564 1726882903.16057: worker is 1 (out of 1 available) 30564 1726882903.16071: exiting _queue_task() for managed_node2/service_facts 30564 1726882903.16084: done queuing things up, now waiting for results queue to drain 30564 1726882903.16085: waiting for pending results... 
30564 1726882903.16266: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882903.16367: in run() - task 0e448fcc-3ce9-4216-acec-000000002205 30564 1726882903.16380: variable 'ansible_search_path' from source: unknown 30564 1726882903.16384: variable 'ansible_search_path' from source: unknown 30564 1726882903.16409: calling self._execute() 30564 1726882903.16487: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882903.16492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882903.16501: variable 'omit' from source: magic vars 30564 1726882903.16768: variable 'ansible_distribution_major_version' from source: facts 30564 1726882903.16782: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882903.16787: variable 'omit' from source: magic vars 30564 1726882903.16841: variable 'omit' from source: magic vars 30564 1726882903.16864: variable 'omit' from source: magic vars 30564 1726882903.16900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882903.16926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882903.16941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882903.16956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882903.16971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882903.16994: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882903.16997: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882903.17000: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882903.17068: Set connection var ansible_timeout to 10 30564 1726882903.17073: Set connection var ansible_pipelining to False 30564 1726882903.17078: Set connection var ansible_shell_type to sh 30564 1726882903.17081: Set connection var ansible_shell_executable to /bin/sh 30564 1726882903.17093: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882903.17095: Set connection var ansible_connection to ssh 30564 1726882903.17112: variable 'ansible_shell_executable' from source: unknown 30564 1726882903.17115: variable 'ansible_connection' from source: unknown 30564 1726882903.17118: variable 'ansible_module_compression' from source: unknown 30564 1726882903.17120: variable 'ansible_shell_type' from source: unknown 30564 1726882903.17122: variable 'ansible_shell_executable' from source: unknown 30564 1726882903.17124: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882903.17129: variable 'ansible_pipelining' from source: unknown 30564 1726882903.17131: variable 'ansible_timeout' from source: unknown 30564 1726882903.17135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882903.17280: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882903.17288: variable 'omit' from source: magic vars 30564 1726882903.17292: starting attempt loop 30564 1726882903.17295: running the handler 30564 1726882903.17309: _low_level_execute_command(): starting 30564 1726882903.17316: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882903.17832: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882903.17856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882903.17874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882903.17888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.17934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882903.17941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882903.17949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882903.18082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882903.19744: stdout chunk (state=3): >>>/root <<< 30564 1726882903.19848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882903.19903: stderr chunk (state=3): >>><<< 30564 1726882903.19906: stdout chunk (state=3): >>><<< 30564 1726882903.19926: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882903.19937: _low_level_execute_command(): starting 30564 1726882903.19942: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595 `" && echo ansible-tmp-1726882903.1992474-35018-93251516691595="` echo /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595 `" ) && sleep 0' 30564 1726882903.20374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882903.20377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882903.20409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.20429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.20482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882903.20494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882903.20597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882903.22467: stdout chunk (state=3): >>>ansible-tmp-1726882903.1992474-35018-93251516691595=/root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595 <<< 30564 1726882903.22588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882903.22626: stderr chunk (state=3): >>><<< 30564 1726882903.22629: stdout chunk (state=3): >>><<< 30564 1726882903.22640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882903.1992474-35018-93251516691595=/root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882903.22678: variable 'ansible_module_compression' from source: unknown 30564 1726882903.22714: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882903.22741: variable 'ansible_facts' from source: unknown 30564 1726882903.22800: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595/AnsiballZ_service_facts.py 30564 1726882903.22893: Sending initial data 30564 1726882903.22898: Sent initial data (161 bytes) 30564 1726882903.23544: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882903.23547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882903.23585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.23589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882903.23591: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.23634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882903.23642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882903.23754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882903.25490: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882903.25586: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882903.25685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpox1yl8we /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595/AnsiballZ_service_facts.py <<< 30564 1726882903.25787: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882903.26829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882903.26921: stderr chunk (state=3): >>><<< 30564 1726882903.26924: stdout chunk (state=3): >>><<< 30564 1726882903.26937: done transferring module to remote 
30564 1726882903.26945: _low_level_execute_command(): starting 30564 1726882903.26950: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595/ /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595/AnsiballZ_service_facts.py && sleep 0' 30564 1726882903.27370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882903.27375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882903.27404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.27415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.27475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882903.27482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882903.27590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882903.29323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882903.29368: stderr chunk (state=3): >>><<< 30564 1726882903.29378: stdout 
chunk (state=3): >>><<< 30564 1726882903.29388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882903.29391: _low_level_execute_command(): starting 30564 1726882903.29396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595/AnsiballZ_service_facts.py && sleep 0' 30564 1726882903.29798: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882903.29810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882903.29837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.29849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882903.29902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882903.29914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882903.30029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882904.61930: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 30564 1726882904.61937: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", 
"status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 30564 1726882904.61953: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 30564 1726882904.61972: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": 
{"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 30564 1726882904.61990: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": 
"serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882904.63327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882904.63358: stderr chunk (state=3): >>><<< 30564 1726882904.63366: stdout chunk (state=3): >>><<< 30564 1726882904.63389: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882904.63780: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882904.63789: _low_level_execute_command(): starting 30564 1726882904.63804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882903.1992474-35018-93251516691595/ > /dev/null 2>&1 && sleep 0' 30564 1726882904.64838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882904.64851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882904.64872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.64892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.64954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882904.64973: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882904.64989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.65009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882904.65032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882904.65044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882904.65057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882904.65076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.65093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.65105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882904.65117: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882904.65132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.65216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882904.65232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882904.65252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882904.65389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882904.67235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882904.67276: stderr chunk (state=3): >>><<< 30564 1726882904.67279: stdout chunk (state=3): >>><<< 30564 1726882904.67290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882904.67298: handler run complete 30564 1726882904.67398: variable 'ansible_facts' from source: unknown 30564 1726882904.67811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882904.69271: variable 'ansible_facts' from source: unknown 30564 1726882904.69275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882904.69277: attempt loop complete, returning result 30564 1726882904.69279: _execute() done 30564 1726882904.69281: dumping result to json 30564 1726882904.69283: done dumping result, returning 30564 1726882904.69285: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000002205] 30564 1726882904.69287: sending task result for task 0e448fcc-3ce9-4216-acec-000000002205 30564 1726882904.70700: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002205 30564 1726882904.70703: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882904.70810: no more pending results, returning what we have 30564 1726882904.70813: results queue empty 30564 1726882904.70814: checking for 
any_errors_fatal 30564 1726882904.70818: done checking for any_errors_fatal 30564 1726882904.70819: checking for max_fail_percentage 30564 1726882904.70820: done checking for max_fail_percentage 30564 1726882904.70821: checking to see if all hosts have failed and the running result is not ok 30564 1726882904.70822: done checking to see if all hosts have failed 30564 1726882904.70823: getting the remaining hosts for this loop 30564 1726882904.70824: done getting the remaining hosts for this loop 30564 1726882904.70828: getting the next task for host managed_node2 30564 1726882904.70835: done getting next task for host managed_node2 30564 1726882904.70838: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882904.70844: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882904.70859: getting variables 30564 1726882904.70860: in VariableManager get_vars() 30564 1726882904.70906: Calling all_inventory to load vars for managed_node2 30564 1726882904.70909: Calling groups_inventory to load vars for managed_node2 30564 1726882904.70911: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882904.70920: Calling all_plugins_play to load vars for managed_node2 30564 1726882904.70923: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882904.70926: Calling groups_plugins_play to load vars for managed_node2 30564 1726882904.72551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882904.74562: done with get_vars() 30564 1726882904.74601: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:44 -0400 (0:00:01.588) 0:01:43.328 ****** 30564 1726882904.74718: entering _queue_task() for managed_node2/package_facts 30564 1726882904.75111: worker is 1 (out of 1 available) 30564 1726882904.75125: exiting _queue_task() for managed_node2/package_facts 30564 1726882904.75138: done queuing things up, now waiting for results queue to drain 30564 1726882904.75140: waiting for pending results... 
30564 1726882904.75480: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882904.75661: in run() - task 0e448fcc-3ce9-4216-acec-000000002206 30564 1726882904.75688: variable 'ansible_search_path' from source: unknown 30564 1726882904.75697: variable 'ansible_search_path' from source: unknown 30564 1726882904.75743: calling self._execute() 30564 1726882904.75857: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882904.75875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882904.75893: variable 'omit' from source: magic vars 30564 1726882904.76352: variable 'ansible_distribution_major_version' from source: facts 30564 1726882904.76380: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882904.76397: variable 'omit' from source: magic vars 30564 1726882904.76488: variable 'omit' from source: magic vars 30564 1726882904.76536: variable 'omit' from source: magic vars 30564 1726882904.76592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882904.76642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882904.76672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882904.76696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882904.76718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882904.76754: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882904.76798: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882904.76806: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882904.76919: Set connection var ansible_timeout to 10 30564 1726882904.76937: Set connection var ansible_pipelining to False 30564 1726882904.76944: Set connection var ansible_shell_type to sh 30564 1726882904.76953: Set connection var ansible_shell_executable to /bin/sh 30564 1726882904.76965: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882904.76976: Set connection var ansible_connection to ssh 30564 1726882904.77007: variable 'ansible_shell_executable' from source: unknown 30564 1726882904.77015: variable 'ansible_connection' from source: unknown 30564 1726882904.77023: variable 'ansible_module_compression' from source: unknown 30564 1726882904.77030: variable 'ansible_shell_type' from source: unknown 30564 1726882904.77044: variable 'ansible_shell_executable' from source: unknown 30564 1726882904.77050: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882904.77058: variable 'ansible_pipelining' from source: unknown 30564 1726882904.77067: variable 'ansible_timeout' from source: unknown 30564 1726882904.77081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882904.77297: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882904.77314: variable 'omit' from source: magic vars 30564 1726882904.77322: starting attempt loop 30564 1726882904.77328: running the handler 30564 1726882904.77344: _low_level_execute_command(): starting 30564 1726882904.77355: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882904.78141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882904.78158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882904.78186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.78205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.78263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882904.78288: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882904.78314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.78341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882904.78367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882904.78382: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882904.78394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882904.78407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.78423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.78444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882904.78457: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882904.78481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.78554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882904.78602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882904.78637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882904.78779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882904.80427: stdout chunk (state=3): >>>/root <<< 30564 1726882904.80531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882904.80584: stderr chunk (state=3): >>><<< 30564 1726882904.80587: stdout chunk (state=3): >>><<< 30564 1726882904.80608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882904.80621: _low_level_execute_command(): starting 30564 1726882904.80628: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201 `" && echo ansible-tmp-1726882904.8060787-35076-150048669793201="` echo /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201 `" ) && sleep 0' 30564 1726882904.81084: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.81088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.81122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.81130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.81140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.81145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.81201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882904.81227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882904.81241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882904.81363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882904.83225: stdout chunk (state=3): >>>ansible-tmp-1726882904.8060787-35076-150048669793201=/root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201 <<< 30564 1726882904.83338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882904.83379: stderr chunk (state=3): >>><<< 30564 1726882904.83383: stdout chunk (state=3): >>><<< 30564 
1726882904.83397: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882904.8060787-35076-150048669793201=/root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882904.83434: variable 'ansible_module_compression' from source: unknown 30564 1726882904.83479: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882904.83527: variable 'ansible_facts' from source: unknown 30564 1726882904.83664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201/AnsiballZ_package_facts.py 30564 1726882904.83775: Sending initial data 30564 1726882904.83780: Sent initial data (162 bytes) 30564 1726882904.84601: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
30564 1726882904.84609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882904.84620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.84649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.84685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882904.84695: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882904.84700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.84715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882904.84722: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882904.84729: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882904.84737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882904.84746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.84758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.84770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882904.84776: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882904.84786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.84858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882904.84877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882904.84889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 
1726882904.85013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882904.86737: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882904.86827: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882904.86926: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpnl5x7qg9 /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201/AnsiballZ_package_facts.py <<< 30564 1726882904.87019: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882904.89382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882904.89609: stderr chunk (state=3): >>><<< 30564 1726882904.89613: stdout chunk (state=3): >>><<< 30564 1726882904.89615: done transferring module to remote 30564 1726882904.89617: _low_level_execute_command(): starting 30564 1726882904.89619: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201/ /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201/AnsiballZ_package_facts.py && sleep 0' 30564 1726882904.90221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
30564 1726882904.90228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.90281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.90284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882904.90287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882904.90289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.90343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882904.90346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882904.90450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882904.92256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882904.92322: stderr chunk (state=3): >>><<< 30564 1726882904.92326: stdout chunk (state=3): >>><<< 30564 1726882904.92411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882904.92414: _low_level_execute_command(): starting 30564 1726882904.92417: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201/AnsiballZ_package_facts.py && sleep 0' 30564 1726882904.93017: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882904.93021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882904.93041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882904.93044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882904.93096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882904.93101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882904.93109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882904.93231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882905.39233: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": 
"pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", 
"release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_<<< 30564 1726882905.39294: stdout chunk (state=3): >>>64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": 
"2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba<<< 30564 1726882905.39308: stdout chunk (state=3): >>>", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": 
"3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": 
[{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epo<<< 30564 1726882905.39337: stdout chunk (state=3): >>>ch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": 
[{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": 
"1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.<<< 30564 1726882905.39345: stdout chunk (state=3): >>>9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source":<<< 30564 1726882905.39394: stdout chunk (state=3): >>> 
"rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": 
"3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": 
"kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rp<<< 30564 1726882905.39401: stdout chunk (state=3): >>>m"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": 
"3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1"<<< 30564 1726882905.39403: stdout chunk (state=3): >>>, "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysin<<< 30564 1726882905.39442: stdout chunk (state=3): >>>it", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", 
"version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": 
[{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": 
[{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": 
"19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882905.40943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
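[Editor's note] The module result that closes above is the standard shape returned by Ansible's `package_facts` module: `ansible_facts.packages` maps each package name to a list of installed instances, each with `name`, `version`, `release`, `epoch` (possibly null), `arch`, and `source`. As a minimal sketch of working with that structure outside Ansible, the helper below (hypothetical, not part of Ansible) renders entries as RPM-style `name-[epoch:]version-release.arch` strings; the sample data is copied from the fact output in this log.

```python
def nevra(pkg):
    """Render one package_facts entry as name-[epoch:]version-release.arch.

    A null/None epoch is conventionally omitted rather than printed as 0:.
    """
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"


# Sample entries taken from the "packages" dict in the module output above.
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9",
              "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

# Each name maps to a *list* because multiple versions of one package
# (e.g. kernel) can be installed side by side.
for instances in packages.values():
    for pkg in instances:
        print(nevra(pkg))
```

Inside a playbook the same data is reachable after a `package_facts` task as `ansible_facts.packages['openssl'][0].version`, which is what conditional tasks in runs like this one typically test against.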
<<< 30564 1726882905.41026: stderr chunk (state=3): >>><<< 30564 1726882905.41029: stdout chunk (state=3): >>><<< 30564 1726882905.41351: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882905.43654: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882905.43676: _low_level_execute_command(): starting 30564 1726882905.43679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882904.8060787-35076-150048669793201/ > /dev/null 2>&1 && sleep 0' 30564 1726882905.44333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882905.44341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882905.44352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882905.44590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882905.44594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882905.44596: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882905.44598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882905.44600: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882905.44603: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882905.44605: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882905.44607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882905.44608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882905.44610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882905.44612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882905.44614: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882905.44615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882905.44617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882905.44618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882905.44620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882905.44844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882905.46669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882905.46677: stdout chunk (state=3): >>><<< 30564 1726882905.46686: stderr chunk (state=3): >>><<< 30564 1726882905.46704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882905.46710: handler run complete 30564 1726882905.47666: variable 'ansible_facts' from source: unknown 30564 1726882905.48715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882905.51968: variable 'ansible_facts' from source: unknown 30564 1726882905.52750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882905.54089: attempt loop complete, returning result 30564 1726882905.54102: _execute() done 30564 1726882905.54105: dumping result to json 30564 1726882905.54512: done dumping result, returning 30564 1726882905.54522: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000002206] 30564 1726882905.54528: sending task result for task 0e448fcc-3ce9-4216-acec-000000002206 30564 1726882905.63583: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002206 30564 1726882905.63587: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882905.63741: no more pending results, returning what we have 30564 1726882905.63743: results queue empty 30564 1726882905.63744: checking for 
any_errors_fatal 30564 1726882905.63749: done checking for any_errors_fatal 30564 1726882905.63750: checking for max_fail_percentage 30564 1726882905.63751: done checking for max_fail_percentage 30564 1726882905.63752: checking to see if all hosts have failed and the running result is not ok 30564 1726882905.63753: done checking to see if all hosts have failed 30564 1726882905.63754: getting the remaining hosts for this loop 30564 1726882905.63755: done getting the remaining hosts for this loop 30564 1726882905.63758: getting the next task for host managed_node2 30564 1726882905.63770: done getting next task for host managed_node2 30564 1726882905.63774: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882905.63779: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882905.63791: getting variables 30564 1726882905.63793: in VariableManager get_vars() 30564 1726882905.63827: Calling all_inventory to load vars for managed_node2 30564 1726882905.63830: Calling groups_inventory to load vars for managed_node2 30564 1726882905.63837: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882905.63847: Calling all_plugins_play to load vars for managed_node2 30564 1726882905.63850: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882905.63853: Calling groups_plugins_play to load vars for managed_node2 30564 1726882905.65152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882905.66874: done with get_vars() 30564 1726882905.66897: done getting variables 30564 1726882905.66954: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:45 -0400 (0:00:00.922) 0:01:44.251 ****** 30564 1726882905.66994: entering _queue_task() for managed_node2/debug 30564 1726882905.67325: worker is 1 (out of 1 available) 30564 1726882905.67339: exiting _queue_task() for managed_node2/debug 30564 1726882905.67351: done queuing things up, now waiting for results queue to drain 30564 1726882905.67353: waiting for pending results... 
30564 1726882905.67658: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882905.67802: in run() - task 0e448fcc-3ce9-4216-acec-0000000021a4 30564 1726882905.67818: variable 'ansible_search_path' from source: unknown 30564 1726882905.67822: variable 'ansible_search_path' from source: unknown 30564 1726882905.67857: calling self._execute() 30564 1726882905.67960: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882905.67968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882905.67981: variable 'omit' from source: magic vars 30564 1726882905.68372: variable 'ansible_distribution_major_version' from source: facts 30564 1726882905.68392: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882905.68396: variable 'omit' from source: magic vars 30564 1726882905.68460: variable 'omit' from source: magic vars 30564 1726882905.68547: variable 'network_provider' from source: set_fact 30564 1726882905.68570: variable 'omit' from source: magic vars 30564 1726882905.68614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882905.68648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882905.68671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882905.68695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882905.68708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882905.68737: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882905.68741: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882905.68743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882905.68852: Set connection var ansible_timeout to 10 30564 1726882905.68855: Set connection var ansible_pipelining to False 30564 1726882905.68857: Set connection var ansible_shell_type to sh 30564 1726882905.68872: Set connection var ansible_shell_executable to /bin/sh 30564 1726882905.68875: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882905.68888: Set connection var ansible_connection to ssh 30564 1726882905.68909: variable 'ansible_shell_executable' from source: unknown 30564 1726882905.68912: variable 'ansible_connection' from source: unknown 30564 1726882905.68916: variable 'ansible_module_compression' from source: unknown 30564 1726882905.68918: variable 'ansible_shell_type' from source: unknown 30564 1726882905.68921: variable 'ansible_shell_executable' from source: unknown 30564 1726882905.68923: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882905.68925: variable 'ansible_pipelining' from source: unknown 30564 1726882905.68927: variable 'ansible_timeout' from source: unknown 30564 1726882905.68934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882905.69070: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882905.69083: variable 'omit' from source: magic vars 30564 1726882905.69088: starting attempt loop 30564 1726882905.69091: running the handler 30564 1726882905.69142: handler run complete 30564 1726882905.69152: attempt loop complete, returning result 30564 1726882905.69155: _execute() done 30564 1726882905.69158: dumping result to json 30564 1726882905.69160: done dumping result, returning 
30564 1726882905.69169: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-0000000021a4] 30564 1726882905.69179: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a4 ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882905.69329: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a4 30564 1726882905.69336: WORKER PROCESS EXITING 30564 1726882905.69350: no more pending results, returning what we have 30564 1726882905.69353: results queue empty 30564 1726882905.69354: checking for any_errors_fatal 30564 1726882905.69371: done checking for any_errors_fatal 30564 1726882905.69372: checking for max_fail_percentage 30564 1726882905.69374: done checking for max_fail_percentage 30564 1726882905.69375: checking to see if all hosts have failed and the running result is not ok 30564 1726882905.69376: done checking to see if all hosts have failed 30564 1726882905.69376: getting the remaining hosts for this loop 30564 1726882905.69379: done getting the remaining hosts for this loop 30564 1726882905.69384: getting the next task for host managed_node2 30564 1726882905.69392: done getting next task for host managed_node2 30564 1726882905.69396: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882905.69402: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882905.69416: getting variables 30564 1726882905.69418: in VariableManager get_vars() 30564 1726882905.69462: Calling all_inventory to load vars for managed_node2 30564 1726882905.69466: Calling groups_inventory to load vars for managed_node2 30564 1726882905.69472: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882905.69483: Calling all_plugins_play to load vars for managed_node2 30564 1726882905.69487: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882905.69490: Calling groups_plugins_play to load vars for managed_node2 30564 1726882905.76605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882905.78376: done with get_vars() 30564 1726882905.78401: done getting variables 30564 1726882905.78448: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:45 -0400 (0:00:00.114) 0:01:44.366 ****** 30564 1726882905.78488: entering _queue_task() for managed_node2/fail 30564 1726882905.78825: worker is 1 (out of 1 available) 30564 1726882905.78838: exiting _queue_task() for managed_node2/fail 30564 1726882905.78851: done queuing things up, now waiting for results queue to drain 30564 1726882905.78852: waiting for pending results... 30564 1726882905.79160: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882905.79334: in run() - task 0e448fcc-3ce9-4216-acec-0000000021a5 30564 1726882905.79354: variable 'ansible_search_path' from source: unknown 30564 1726882905.79362: variable 'ansible_search_path' from source: unknown 30564 1726882905.79412: calling self._execute() 30564 1726882905.79519: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882905.79533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882905.79548: variable 'omit' from source: magic vars 30564 1726882905.79949: variable 'ansible_distribution_major_version' from source: facts 30564 1726882905.79972: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882905.80110: variable 'network_state' from source: role '' defaults 30564 1726882905.80125: Evaluated conditional (network_state != {}): False 30564 1726882905.80133: when evaluation is False, skipping this task 30564 1726882905.80141: _execute() done 30564 1726882905.80149: dumping result to json 30564 1726882905.80156: done dumping result, returning 30564 1726882905.80173: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-0000000021a5] 30564 1726882905.80184: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a5 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882905.80337: no more pending results, returning what we have 30564 1726882905.80341: results queue empty 30564 1726882905.80342: checking for any_errors_fatal 30564 1726882905.80353: done checking for any_errors_fatal 30564 1726882905.80354: checking for max_fail_percentage 30564 1726882905.80356: done checking for max_fail_percentage 30564 1726882905.80357: checking to see if all hosts have failed and the running result is not ok 30564 1726882905.80358: done checking to see if all hosts have failed 30564 1726882905.80359: getting the remaining hosts for this loop 30564 1726882905.80360: done getting the remaining hosts for this loop 30564 1726882905.80365: getting the next task for host managed_node2 30564 1726882905.80376: done getting next task for host managed_node2 30564 1726882905.80382: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882905.80389: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882905.80413: getting variables 30564 1726882905.80415: in VariableManager get_vars() 30564 1726882905.80459: Calling all_inventory to load vars for managed_node2 30564 1726882905.80462: Calling groups_inventory to load vars for managed_node2 30564 1726882905.80467: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882905.80482: Calling all_plugins_play to load vars for managed_node2 30564 1726882905.80485: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882905.80488: Calling groups_plugins_play to load vars for managed_node2 30564 1726882905.81484: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a5 30564 1726882905.81488: WORKER PROCESS EXITING 30564 1726882905.82240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882905.84187: done with get_vars() 30564 1726882905.84212: done getting variables 30564 1726882905.84283: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:45 -0400 (0:00:00.058) 0:01:44.424 ****** 30564 1726882905.84316: entering _queue_task() for managed_node2/fail 30564 1726882905.84621: worker is 1 (out of 1 available) 30564 1726882905.84634: exiting _queue_task() for managed_node2/fail 30564 1726882905.84648: done queuing things up, now waiting for results queue to drain 30564 1726882905.84649: waiting for pending results... 30564 1726882905.84947: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882905.85115: in run() - task 0e448fcc-3ce9-4216-acec-0000000021a6 30564 1726882905.85138: variable 'ansible_search_path' from source: unknown 30564 1726882905.85147: variable 'ansible_search_path' from source: unknown 30564 1726882905.85190: calling self._execute() 30564 1726882905.85304: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882905.85319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882905.85339: variable 'omit' from source: magic vars 30564 1726882905.85754: variable 'ansible_distribution_major_version' from source: facts 30564 1726882905.85779: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882905.85916: variable 'network_state' from source: role '' defaults 30564 1726882905.85934: Evaluated conditional (network_state != {}): False 30564 1726882905.85943: when evaluation is False, skipping this task 30564 1726882905.85952: _execute() done 30564 1726882905.85966: dumping result to json 30564 1726882905.85981: done dumping result, returning 30564 1726882905.85996: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-0000000021a6] 30564 1726882905.86009: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a6 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882905.86175: no more pending results, returning what we have 30564 1726882905.86180: results queue empty 30564 1726882905.86181: checking for any_errors_fatal 30564 1726882905.86190: done checking for any_errors_fatal 30564 1726882905.86191: checking for max_fail_percentage 30564 1726882905.86193: done checking for max_fail_percentage 30564 1726882905.86194: checking to see if all hosts have failed and the running result is not ok 30564 1726882905.86195: done checking to see if all hosts have failed 30564 1726882905.86196: getting the remaining hosts for this loop 30564 1726882905.86198: done getting the remaining hosts for this loop 30564 1726882905.86202: getting the next task for host managed_node2 30564 1726882905.86212: done getting next task for host managed_node2 30564 1726882905.86218: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882905.86225: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882905.86252: getting variables 30564 1726882905.86254: in VariableManager get_vars() 30564 1726882905.86308: Calling all_inventory to load vars for managed_node2 30564 1726882905.86312: Calling groups_inventory to load vars for managed_node2 30564 1726882905.86314: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882905.86329: Calling all_plugins_play to load vars for managed_node2 30564 1726882905.86332: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882905.86336: Calling groups_plugins_play to load vars for managed_node2 30564 1726882905.87286: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a6 30564 1726882905.87290: WORKER PROCESS EXITING 30564 1726882905.88183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882905.90827: done with get_vars() 30564 1726882905.90859: done getting variables 30564 1726882905.91286: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:45 -0400 (0:00:00.070) 0:01:44.494 ****** 30564 1726882905.91352: entering _queue_task() for managed_node2/fail 30564 1726882905.91700: worker is 1 (out of 1 available) 30564 1726882905.91714: exiting _queue_task() for managed_node2/fail 30564 1726882905.91727: done queuing things up, now waiting for results queue to drain 30564 1726882905.91729: waiting for pending results... 30564 1726882905.92036: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882905.92210: in run() - task 0e448fcc-3ce9-4216-acec-0000000021a7 30564 1726882905.92230: variable 'ansible_search_path' from source: unknown 30564 1726882905.92239: variable 'ansible_search_path' from source: unknown 30564 1726882905.92288: calling self._execute() 30564 1726882905.92405: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882905.92418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882905.92435: variable 'omit' from source: magic vars 30564 1726882905.92837: variable 'ansible_distribution_major_version' from source: facts 30564 1726882905.92856: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882905.93043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882905.96312: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882905.96773: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882905.96818: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882905.96853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882905.96887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882905.96972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882905.97009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882905.97043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882905.97093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882905.97112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882905.97213: variable 'ansible_distribution_major_version' from source: facts 30564 1726882905.97232: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882905.97239: when evaluation is False, skipping this task 30564 1726882905.97250: _execute() done 30564 1726882905.97258: dumping result to json 30564 1726882905.97266: done dumping result, returning 30564 1726882905.97283: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0e448fcc-3ce9-4216-acec-0000000021a7] 30564 1726882905.97292: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a7 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882905.97445: no more pending results, returning what we have 30564 1726882905.97449: results queue empty 30564 1726882905.97450: checking for any_errors_fatal 30564 1726882905.97458: done checking for any_errors_fatal 30564 1726882905.97458: checking for max_fail_percentage 30564 1726882905.97460: done checking for max_fail_percentage 30564 1726882905.97461: checking to see if all hosts have failed and the running result is not ok 30564 1726882905.97462: done checking to see if all hosts have failed 30564 1726882905.97465: getting the remaining hosts for this loop 30564 1726882905.97470: done getting the remaining hosts for this loop 30564 1726882905.97474: getting the next task for host managed_node2 30564 1726882905.97484: done getting next task for host managed_node2 30564 1726882905.97489: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882905.97496: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882905.97517: getting variables 30564 1726882905.97519: in VariableManager get_vars() 30564 1726882905.97567: Calling all_inventory to load vars for managed_node2 30564 1726882905.97572: Calling groups_inventory to load vars for managed_node2 30564 1726882905.97575: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882905.97586: Calling all_plugins_play to load vars for managed_node2 30564 1726882905.97589: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882905.97592: Calling groups_plugins_play to load vars for managed_node2 30564 1726882905.98986: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a7 30564 1726882905.98989: WORKER PROCESS EXITING 30564 1726882905.99567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.01273: done with get_vars() 30564 1726882906.01300: done getting variables 30564 1726882906.01362: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:46 -0400 (0:00:00.100) 0:01:44.595 ****** 30564 1726882906.01401: entering _queue_task() for managed_node2/dnf 30564 1726882906.02032: worker is 1 (out of 1 available) 30564 1726882906.02047: exiting _queue_task() for managed_node2/dnf 30564 1726882906.02061: done queuing things up, now waiting for results queue to drain 30564 1726882906.02062: waiting for pending results... 30564 1726882906.02424: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882906.02581: in run() - task 0e448fcc-3ce9-4216-acec-0000000021a8 30564 1726882906.02599: variable 'ansible_search_path' from source: unknown 30564 1726882906.02603: variable 'ansible_search_path' from source: unknown 30564 1726882906.02643: calling self._execute() 30564 1726882906.02796: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.02803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.02813: variable 'omit' from source: magic vars 30564 1726882906.03215: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.03228: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.03438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882906.06666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882906.06750: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882906.06794: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882906.06836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882906.06871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882906.07011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.07054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.07162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.07213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.07374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.07606: variable 'ansible_distribution' from source: facts 30564 1726882906.07616: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.07638: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882906.07855: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882906.08074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.08249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.08288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.08348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.08378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.08435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.08472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.08510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.08589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.08608: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.08651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.08709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.08748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.08804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.08823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.09029: variable 'network_connections' from source: include params 30564 1726882906.09046: variable 'interface' from source: play vars 30564 1726882906.09121: variable 'interface' from source: play vars 30564 1726882906.09202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882906.09458: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882906.09524: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882906.09579: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882906.09613: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882906.09688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882906.09755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882906.09803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.09835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882906.09925: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882906.10309: variable 'network_connections' from source: include params 30564 1726882906.10320: variable 'interface' from source: play vars 30564 1726882906.10393: variable 'interface' from source: play vars 30564 1726882906.10435: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882906.10444: when evaluation is False, skipping this task 30564 1726882906.10451: _execute() done 30564 1726882906.10459: dumping result to json 30564 1726882906.10474: done dumping result, returning 30564 1726882906.10487: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000021a8] 30564 
1726882906.10508: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a8 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882906.11104: no more pending results, returning what we have 30564 1726882906.11108: results queue empty 30564 1726882906.11109: checking for any_errors_fatal 30564 1726882906.11118: done checking for any_errors_fatal 30564 1726882906.11118: checking for max_fail_percentage 30564 1726882906.11120: done checking for max_fail_percentage 30564 1726882906.11121: checking to see if all hosts have failed and the running result is not ok 30564 1726882906.11122: done checking to see if all hosts have failed 30564 1726882906.11123: getting the remaining hosts for this loop 30564 1726882906.11125: done getting the remaining hosts for this loop 30564 1726882906.11128: getting the next task for host managed_node2 30564 1726882906.11138: done getting next task for host managed_node2 30564 1726882906.11142: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882906.11148: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882906.11174: getting variables 30564 1726882906.11176: in VariableManager get_vars() 30564 1726882906.11222: Calling all_inventory to load vars for managed_node2 30564 1726882906.11224: Calling groups_inventory to load vars for managed_node2 30564 1726882906.11227: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882906.11238: Calling all_plugins_play to load vars for managed_node2 30564 1726882906.11241: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882906.11243: Calling groups_plugins_play to load vars for managed_node2 30564 1726882906.12527: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a8 30564 1726882906.12530: WORKER PROCESS EXITING 30564 1726882906.13167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.15921: done with get_vars() 30564 1726882906.15949: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882906.16030: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:41:46 -0400 (0:00:00.146) 0:01:44.741 ****** 30564 1726882906.16067: entering _queue_task() for managed_node2/yum 30564 1726882906.16473: worker is 1 (out of 1 available) 30564 1726882906.16485: exiting _queue_task() for managed_node2/yum 30564 1726882906.16498: done queuing things up, now waiting for results queue to drain 30564 1726882906.16500: waiting for pending results... 30564 1726882906.16896: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882906.17095: in run() - task 0e448fcc-3ce9-4216-acec-0000000021a9 30564 1726882906.17123: variable 'ansible_search_path' from source: unknown 30564 1726882906.17217: variable 'ansible_search_path' from source: unknown 30564 1726882906.17260: calling self._execute() 30564 1726882906.17377: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.17407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.17429: variable 'omit' from source: magic vars 30564 1726882906.17895: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.17913: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.18114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882906.22207: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882906.22300: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882906.22341: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882906.22385: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882906.22417: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882906.22493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.22526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.22552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.22597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.22623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.22746: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.22774: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882906.22783: when evaluation is False, skipping this task 30564 1726882906.22791: _execute() done 30564 1726882906.22798: dumping result to json 30564 1726882906.22804: done dumping result, 
returning 30564 1726882906.22816: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000021a9] 30564 1726882906.22830: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a9 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882906.23031: no more pending results, returning what we have 30564 1726882906.23036: results queue empty 30564 1726882906.23037: checking for any_errors_fatal 30564 1726882906.23045: done checking for any_errors_fatal 30564 1726882906.23046: checking for max_fail_percentage 30564 1726882906.23048: done checking for max_fail_percentage 30564 1726882906.23050: checking to see if all hosts have failed and the running result is not ok 30564 1726882906.23050: done checking to see if all hosts have failed 30564 1726882906.23051: getting the remaining hosts for this loop 30564 1726882906.23053: done getting the remaining hosts for this loop 30564 1726882906.23058: getting the next task for host managed_node2 30564 1726882906.23071: done getting next task for host managed_node2 30564 1726882906.23078: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882906.23084: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882906.23108: getting variables 30564 1726882906.23110: in VariableManager get_vars() 30564 1726882906.23161: Calling all_inventory to load vars for managed_node2 30564 1726882906.23185: Calling groups_inventory to load vars for managed_node2 30564 1726882906.23189: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882906.23202: Calling all_plugins_play to load vars for managed_node2 30564 1726882906.23205: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882906.23209: Calling groups_plugins_play to load vars for managed_node2 30564 1726882906.24575: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021a9 30564 1726882906.24580: WORKER PROCESS EXITING 30564 1726882906.26516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.28332: done with get_vars() 30564 1726882906.28361: done getting variables 30564 1726882906.28422: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:46 -0400 (0:00:00.123) 0:01:44.865 ****** 30564 1726882906.28462: entering _queue_task() for managed_node2/fail 30564 1726882906.28872: worker is 1 (out of 1 available) 30564 1726882906.28887: exiting _queue_task() for managed_node2/fail 30564 1726882906.28901: done queuing things up, now waiting for results queue to drain 30564 1726882906.28902: waiting for pending results... 30564 1726882906.29284: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882906.29447: in run() - task 0e448fcc-3ce9-4216-acec-0000000021aa 30564 1726882906.29479: variable 'ansible_search_path' from source: unknown 30564 1726882906.29489: variable 'ansible_search_path' from source: unknown 30564 1726882906.29548: calling self._execute() 30564 1726882906.29681: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.29701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.29717: variable 'omit' from source: magic vars 30564 1726882906.30244: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.30290: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.30422: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882906.30637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882906.32626: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882906.32686: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882906.32714: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882906.32741: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882906.32762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882906.32824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.32844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.32865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.32894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.32904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.32940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 
1726882906.32955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.32978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.33003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.33013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.33043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.33059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.33082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.33106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.33116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30564 1726882906.33292: variable 'network_connections' from source: include params 30564 1726882906.33295: variable 'interface' from source: play vars 30564 1726882906.33374: variable 'interface' from source: play vars 30564 1726882906.33544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882906.33670: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882906.33907: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882906.33935: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882906.33961: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882906.34006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882906.34032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882906.34056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.34087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882906.34149: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882906.34542: variable 'network_connections' from source: include params 30564 1726882906.34547: variable 'interface' from source: play 
vars 30564 1726882906.34616: variable 'interface' from source: play vars 30564 1726882906.34641: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882906.34645: when evaluation is False, skipping this task 30564 1726882906.34653: _execute() done 30564 1726882906.34656: dumping result to json 30564 1726882906.34660: done dumping result, returning 30564 1726882906.34668: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000021aa] 30564 1726882906.34676: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021aa 30564 1726882906.34771: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021aa 30564 1726882906.34787: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882906.34835: no more pending results, returning what we have 30564 1726882906.34838: results queue empty 30564 1726882906.34839: checking for any_errors_fatal 30564 1726882906.34847: done checking for any_errors_fatal 30564 1726882906.34848: checking for max_fail_percentage 30564 1726882906.34850: done checking for max_fail_percentage 30564 1726882906.34851: checking to see if all hosts have failed and the running result is not ok 30564 1726882906.34851: done checking to see if all hosts have failed 30564 1726882906.34852: getting the remaining hosts for this loop 30564 1726882906.34854: done getting the remaining hosts for this loop 30564 1726882906.34858: getting the next task for host managed_node2 30564 1726882906.34869: done getting next task for host managed_node2 30564 1726882906.34873: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882906.34880: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882906.34902: getting variables 30564 1726882906.34904: in VariableManager get_vars() 30564 1726882906.34946: Calling all_inventory to load vars for managed_node2 30564 1726882906.34949: Calling groups_inventory to load vars for managed_node2 30564 1726882906.34951: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882906.34960: Calling all_plugins_play to load vars for managed_node2 30564 1726882906.34963: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882906.34967: Calling groups_plugins_play to load vars for managed_node2 30564 1726882906.35918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.37278: done with get_vars() 30564 1726882906.37300: done getting variables 30564 1726882906.37363: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:46 -0400 (0:00:00.089) 0:01:44.955 ****** 30564 1726882906.37402: entering _queue_task() for managed_node2/package 30564 1726882906.37738: worker is 1 (out of 1 available) 30564 1726882906.37750: exiting _queue_task() for managed_node2/package 30564 1726882906.37763: done queuing things up, now waiting for results queue to drain 30564 1726882906.37773: waiting for pending results... 
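The "Evaluated conditional ... False / when evaluation is False, skipping this task" messages above come from conditional tasks in the fedora.linux_system_roles.network role (task path main.yml:48, :60, :73 in this log). A minimal sketch of that pattern follows; the task names and `when:` expressions are taken verbatim from the log, but the surrounding YAML structure and module arguments are assumptions, not the role's actual source:

```yaml
# Hypothetical reconstruction of the skipped tasks seen in this log.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:                       # log shows this redirected to ansible.builtin.dnf
    list: updates                            # assumption: real module args not shown in the log
  when: ansible_distribution_major_version | int < 8   # evaluated False here, so skipped

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: "(message not shown in the log)"    # assumption
  when: __network_wireless_connections_defined or __network_team_connections_defined

- name: Install packages
  ansible.builtin.package:                   # log shows the 'package' action plugin loading next
    name: "{{ network_packages }}"           # assumption based on the network_packages variable above
    state: present                           # assumption
```

Each skip is reported back as `skipping: [managed_node2]` with the failed condition echoed in `false_condition`, which is exactly what the JSON result blocks in this log show.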
30564 1726882906.38187: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882906.38351: in run() - task 0e448fcc-3ce9-4216-acec-0000000021ab 30564 1726882906.38357: variable 'ansible_search_path' from source: unknown 30564 1726882906.38361: variable 'ansible_search_path' from source: unknown 30564 1726882906.38388: calling self._execute() 30564 1726882906.38464: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.38473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.38482: variable 'omit' from source: magic vars 30564 1726882906.38774: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.38786: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.38925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882906.39123: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882906.39156: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882906.39183: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882906.39231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882906.39311: variable 'network_packages' from source: role '' defaults 30564 1726882906.39385: variable '__network_provider_setup' from source: role '' defaults 30564 1726882906.39393: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882906.39437: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882906.39443: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882906.39499: variable 
'__network_packages_default_nm' from source: role '' defaults 30564 1726882906.39618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882906.41618: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882906.41775: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882906.41813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882906.41851: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882906.41885: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882906.41975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.42010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.42042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.42092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.42711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882906.42780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.42847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.42897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.42942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.43006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.43239: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882906.43319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.43336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.43353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.43388: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.43399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.43462: variable 'ansible_python' from source: facts 30564 1726882906.43478: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882906.43535: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882906.43591: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882906.43675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.43692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.43708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.43732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.43746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.43780: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.43798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.43815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.43839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.43854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.43947: variable 'network_connections' from source: include params 30564 1726882906.43950: variable 'interface' from source: play vars 30564 1726882906.44024: variable 'interface' from source: play vars 30564 1726882906.44079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882906.44098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882906.44120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.44141: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882906.44183: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882906.44358: variable 'network_connections' from source: include params 30564 1726882906.44361: variable 'interface' from source: play vars 30564 1726882906.44435: variable 'interface' from source: play vars 30564 1726882906.44473: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882906.44527: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882906.44722: variable 'network_connections' from source: include params 30564 1726882906.44725: variable 'interface' from source: play vars 30564 1726882906.44774: variable 'interface' from source: play vars 30564 1726882906.44792: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882906.44845: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882906.45676: variable 'network_connections' from source: include params 30564 1726882906.45680: variable 'interface' from source: play vars 30564 1726882906.45682: variable 'interface' from source: play vars 30564 1726882906.45685: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882906.45687: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882906.45689: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882906.45691: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882906.45693: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882906.46106: variable 'network_connections' from source: include params 30564 1726882906.46110: variable 'interface' from 
source: play vars 30564 1726882906.46170: variable 'interface' from source: play vars 30564 1726882906.46177: variable 'ansible_distribution' from source: facts 30564 1726882906.46180: variable '__network_rh_distros' from source: role '' defaults 30564 1726882906.46186: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.46212: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882906.46362: variable 'ansible_distribution' from source: facts 30564 1726882906.46369: variable '__network_rh_distros' from source: role '' defaults 30564 1726882906.46373: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.46381: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882906.46589: variable 'ansible_distribution' from source: facts 30564 1726882906.46592: variable '__network_rh_distros' from source: role '' defaults 30564 1726882906.46595: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.46597: variable 'network_provider' from source: set_fact 30564 1726882906.46599: variable 'ansible_facts' from source: unknown 30564 1726882906.48024: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882906.48027: when evaluation is False, skipping this task 30564 1726882906.48030: _execute() done 30564 1726882906.48033: dumping result to json 30564 1726882906.48035: done dumping result, returning 30564 1726882906.48044: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-0000000021ab] 30564 1726882906.48049: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021ab 30564 1726882906.48149: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021ab 30564 1726882906.48152: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882906.48202: no more pending results, returning what we have 30564 1726882906.48206: results queue empty 30564 1726882906.48207: checking for any_errors_fatal 30564 1726882906.48214: done checking for any_errors_fatal 30564 1726882906.48215: checking for max_fail_percentage 30564 1726882906.48217: done checking for max_fail_percentage 30564 1726882906.48218: checking to see if all hosts have failed and the running result is not ok 30564 1726882906.48219: done checking to see if all hosts have failed 30564 1726882906.48219: getting the remaining hosts for this loop 30564 1726882906.48221: done getting the remaining hosts for this loop 30564 1726882906.48225: getting the next task for host managed_node2 30564 1726882906.48233: done getting next task for host managed_node2 30564 1726882906.48237: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882906.48244: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882906.48265: getting variables 30564 1726882906.48267: in VariableManager get_vars() 30564 1726882906.48318: Calling all_inventory to load vars for managed_node2 30564 1726882906.48320: Calling groups_inventory to load vars for managed_node2 30564 1726882906.48322: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882906.48333: Calling all_plugins_play to load vars for managed_node2 30564 1726882906.48335: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882906.48338: Calling groups_plugins_play to load vars for managed_node2 30564 1726882906.49883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.51700: done with get_vars() 30564 1726882906.51729: done getting variables 30564 1726882906.51798: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:46 -0400 (0:00:00.144) 0:01:45.099 ****** 30564 1726882906.51839: entering _queue_task() for managed_node2/package 30564 1726882906.52199: worker is 1 (out of 1 available) 30564 1726882906.52212: exiting _queue_task() for managed_node2/package 30564 1726882906.52226: done queuing things up, now waiting for results queue to drain 30564 
1726882906.52228: waiting for pending results... 30564 1726882906.52559: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882906.52723: in run() - task 0e448fcc-3ce9-4216-acec-0000000021ac 30564 1726882906.52742: variable 'ansible_search_path' from source: unknown 30564 1726882906.52751: variable 'ansible_search_path' from source: unknown 30564 1726882906.52804: calling self._execute() 30564 1726882906.52915: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.52925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.52940: variable 'omit' from source: magic vars 30564 1726882906.53336: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.53355: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.53492: variable 'network_state' from source: role '' defaults 30564 1726882906.53507: Evaluated conditional (network_state != {}): False 30564 1726882906.53515: when evaluation is False, skipping this task 30564 1726882906.53521: _execute() done 30564 1726882906.53528: dumping result to json 30564 1726882906.53535: done dumping result, returning 30564 1726882906.53547: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-0000000021ac] 30564 1726882906.53564: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021ac skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882906.53726: no more pending results, returning what we have 30564 1726882906.53730: results queue empty 30564 1726882906.53731: checking for any_errors_fatal 30564 1726882906.53739: done checking for any_errors_fatal 30564 
1726882906.53739: checking for max_fail_percentage 30564 1726882906.53742: done checking for max_fail_percentage 30564 1726882906.53743: checking to see if all hosts have failed and the running result is not ok 30564 1726882906.53743: done checking to see if all hosts have failed 30564 1726882906.53744: getting the remaining hosts for this loop 30564 1726882906.53746: done getting the remaining hosts for this loop 30564 1726882906.53750: getting the next task for host managed_node2 30564 1726882906.53759: done getting next task for host managed_node2 30564 1726882906.53763: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882906.53775: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882906.53801: getting variables 30564 1726882906.53803: in VariableManager get_vars() 30564 1726882906.53849: Calling all_inventory to load vars for managed_node2 30564 1726882906.53852: Calling groups_inventory to load vars for managed_node2 30564 1726882906.53855: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882906.53872: Calling all_plugins_play to load vars for managed_node2 30564 1726882906.53876: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882906.53879: Calling groups_plugins_play to load vars for managed_node2 30564 1726882906.54905: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021ac 30564 1726882906.54908: WORKER PROCESS EXITING 30564 1726882906.56393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.58784: done with get_vars() 30564 1726882906.58817: done getting variables 30564 1726882906.58919: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:41:46 -0400 (0:00:00.071) 0:01:45.170 ****** 30564 1726882906.58960: entering _queue_task() for managed_node2/package 30564 1726882906.59520: worker is 1 (out of 1 available) 30564 1726882906.59534: exiting _queue_task() for managed_node2/package 30564 1726882906.59548: done queuing things up, now waiting for results queue to drain 30564 1726882906.59550: waiting for pending results... 
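The skipped-task entries in this trace all follow one pattern: Ansible evaluates each `when` clause in order, short-circuits to a skip on the first clause that comes out False, and reports that clause as `false_condition`. Here the distribution-version gate passed (`ansible_distribution_major_version != '6'`: True) and the `network_state != {}` gate failed. A minimal Python sketch of that evaluation order (illustrative only, not Ansible's internal API; the variable values are assumptions matching the log):

```python
def evaluate_when(conditionals, variables):
    """Check each (expression, predicate) pair in order and
    short-circuit on the first False, mirroring the
    'Evaluated conditional (...): False ... when evaluation is
    False, skipping this task' lines in the trace above."""
    for expr, predicate in conditionals:
        if not predicate(variables):
            return False, expr  # reported as "false_condition"
    return True, None

# The task above was gated by two clauses, checked in this order:
conds = [
    ("ansible_distribution_major_version != '6'",
     lambda v: v["ansible_distribution_major_version"] != "6"),
    ("network_state != {}",
     lambda v: v["network_state"] != {}),
]
run, why = evaluate_when(conds, {
    "ansible_distribution_major_version": "9",  # assumed value for illustration
    "network_state": {},                        # role default is empty, per the log
})
# run is False and why is "network_state != {}",
# matching the skip result recorded above
```

The short-circuit explains why only one `false_condition` appears per skip even when a task carries several `when` clauses.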
30564 1726882906.60612: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882906.60797: in run() - task 0e448fcc-3ce9-4216-acec-0000000021ad 30564 1726882906.60817: variable 'ansible_search_path' from source: unknown 30564 1726882906.60824: variable 'ansible_search_path' from source: unknown 30564 1726882906.60875: calling self._execute() 30564 1726882906.60990: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.61001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.61017: variable 'omit' from source: magic vars 30564 1726882906.61447: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.61512: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.61644: variable 'network_state' from source: role '' defaults 30564 1726882906.61660: Evaluated conditional (network_state != {}): False 30564 1726882906.61672: when evaluation is False, skipping this task 30564 1726882906.61680: _execute() done 30564 1726882906.61687: dumping result to json 30564 1726882906.61693: done dumping result, returning 30564 1726882906.61706: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-0000000021ad] 30564 1726882906.61720: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021ad skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882906.61886: no more pending results, returning what we have 30564 1726882906.61890: results queue empty 30564 1726882906.61891: checking for any_errors_fatal 30564 1726882906.61899: done checking for any_errors_fatal 30564 1726882906.61900: checking for max_fail_percentage 30564 
1726882906.61902: done checking for max_fail_percentage 30564 1726882906.61903: checking to see if all hosts have failed and the running result is not ok 30564 1726882906.61904: done checking to see if all hosts have failed 30564 1726882906.61905: getting the remaining hosts for this loop 30564 1726882906.61907: done getting the remaining hosts for this loop 30564 1726882906.61911: getting the next task for host managed_node2 30564 1726882906.61920: done getting next task for host managed_node2 30564 1726882906.61924: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882906.61932: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882906.61958: getting variables 30564 1726882906.61960: in VariableManager get_vars() 30564 1726882906.62013: Calling all_inventory to load vars for managed_node2 30564 1726882906.62016: Calling groups_inventory to load vars for managed_node2 30564 1726882906.62019: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882906.62031: Calling all_plugins_play to load vars for managed_node2 30564 1726882906.62034: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882906.62037: Calling groups_plugins_play to load vars for managed_node2 30564 1726882906.62985: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021ad 30564 1726882906.62989: WORKER PROCESS EXITING 30564 1726882906.65085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.67057: done with get_vars() 30564 1726882906.68497: done getting variables 30564 1726882906.68558: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:41:46 -0400 (0:00:00.096) 0:01:45.267 ****** 30564 1726882906.68599: entering _queue_task() for managed_node2/service 30564 1726882906.69315: worker is 1 (out of 1 available) 30564 1726882906.69327: exiting _queue_task() for managed_node2/service 30564 1726882906.69340: done queuing things up, now waiting for results queue to drain 30564 1726882906.69341: waiting for pending results... 
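Earlier in the trace, the "Install packages" task was skipped on `not network_packages is subset(ansible_facts.packages.keys())`, i.e. every requested package was already present in the gathered package facts. Jinja2's builtin `subset` test behaves like Python's set-containment check; a small sketch with sample data (package names here are illustrative, not taken from the run):

```python
def subset_test(value, other):
    """Jinja2-style 'subset' test: True when every item of
    value is contained in other (set semantics)."""
    return set(value) <= set(other)

# Packages already installed on the host (sample data)
installed = {"NetworkManager", "openssh-server", "kernel"}
network_packages = ["NetworkManager"]

# The install task runs only when NOT all requested
# packages are already installed
should_install = not subset_test(network_packages, installed)
# should_install is False here, so the task is skipped with
# "Conditional result was False", as in the trace above
```

Because the check uses set semantics, duplicates and ordering in `network_packages` are irrelevant; only membership in the facts-derived package list matters.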
30564 1726882906.70045: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882906.70184: in run() - task 0e448fcc-3ce9-4216-acec-0000000021ae 30564 1726882906.70203: variable 'ansible_search_path' from source: unknown 30564 1726882906.70210: variable 'ansible_search_path' from source: unknown 30564 1726882906.70252: calling self._execute() 30564 1726882906.70362: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.70376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.70392: variable 'omit' from source: magic vars 30564 1726882906.70756: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.70776: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.70899: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882906.71098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882906.73396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882906.73477: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882906.73519: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882906.73555: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882906.73588: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882906.73671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882906.73705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.73738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.73790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.73809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.73855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.73887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.73916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.73958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.73979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.74022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.74051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.74082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.74123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.74140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.74307: variable 'network_connections' from source: include params 30564 1726882906.74323: variable 'interface' from source: play vars 30564 1726882906.74389: variable 'interface' from source: play vars 30564 1726882906.74459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882906.74621: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882906.74674: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882906.74709: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882906.74740: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882906.74787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882906.74815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882906.74843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.74876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882906.74936: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882906.75179: variable 'network_connections' from source: include params 30564 1726882906.75189: variable 'interface' from source: play vars 30564 1726882906.75251: variable 'interface' from source: play vars 30564 1726882906.75287: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882906.75294: when evaluation is False, skipping this task 30564 1726882906.75301: _execute() done 30564 1726882906.75307: dumping result to json 30564 1726882906.75313: done dumping result, returning 30564 1726882906.75322: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000021ae] 30564 1726882906.75333: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021ae 30564 1726882906.75442: done sending task result for task 
0e448fcc-3ce9-4216-acec-0000000021ae 30564 1726882906.75498: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882906.75509: no more pending results, returning what we have 30564 1726882906.75513: results queue empty 30564 1726882906.75514: checking for any_errors_fatal 30564 1726882906.75523: done checking for any_errors_fatal 30564 1726882906.75523: checking for max_fail_percentage 30564 1726882906.75525: done checking for max_fail_percentage 30564 1726882906.75526: checking to see if all hosts have failed and the running result is not ok 30564 1726882906.75527: done checking to see if all hosts have failed 30564 1726882906.75528: getting the remaining hosts for this loop 30564 1726882906.75530: done getting the remaining hosts for this loop 30564 1726882906.75534: getting the next task for host managed_node2 30564 1726882906.75543: done getting next task for host managed_node2 30564 1726882906.75548: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882906.75555: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882906.75577: getting variables 30564 1726882906.75580: in VariableManager get_vars() 30564 1726882906.75625: Calling all_inventory to load vars for managed_node2 30564 1726882906.75627: Calling groups_inventory to load vars for managed_node2 30564 1726882906.75630: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882906.75640: Calling all_plugins_play to load vars for managed_node2 30564 1726882906.75643: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882906.75646: Calling groups_plugins_play to load vars for managed_node2 30564 1726882906.77388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882906.79153: done with get_vars() 30564 1726882906.79178: done getting variables 30564 1726882906.79237: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:41:46 -0400 (0:00:00.106) 0:01:45.374 ******
30564 1726882906.79271: entering _queue_task() for managed_node2/service 30564 1726882906.79568: worker is 1 (out of 1 available) 30564 1726882906.79580: exiting _queue_task() for managed_node2/service 30564 1726882906.79592: done
queuing things up, now waiting for results queue to drain 30564 1726882906.79593: waiting for pending results... 30564 1726882906.79890: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882906.80037: in run() - task 0e448fcc-3ce9-4216-acec-0000000021af 30564 1726882906.80056: variable 'ansible_search_path' from source: unknown 30564 1726882906.80066: variable 'ansible_search_path' from source: unknown 30564 1726882906.80104: calling self._execute() 30564 1726882906.80209: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.80219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.80234: variable 'omit' from source: magic vars 30564 1726882906.80602: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.80620: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882906.80768: variable 'network_provider' from source: set_fact 30564 1726882906.80780: variable 'network_state' from source: role '' defaults 30564 1726882906.80801: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882906.80812: variable 'omit' from source: magic vars 30564 1726882906.80876: variable 'omit' from source: magic vars 30564 1726882906.80904: variable 'network_service_name' from source: role '' defaults 30564 1726882906.80972: variable 'network_service_name' from source: role '' defaults 30564 1726882906.81092: variable '__network_provider_setup' from source: role '' defaults 30564 1726882906.81104: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882906.81176: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882906.81191: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882906.81294: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882906.81815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882906.86334: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882906.86417: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882906.86459: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882906.86501: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882906.86530: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882906.86615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.86738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.86789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.86840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.86860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.86941: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.86971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.87000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.87076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.87096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.87337: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882906.87456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.87492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.87523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.87568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.87592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.87688: variable 'ansible_python' from source: facts 30564 1726882906.87713: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882906.87801: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882906.87889: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882906.88022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.88054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.88088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.88133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.88155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.88206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882906.88244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882906.88281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.88326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882906.88345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882906.88499: variable 'network_connections' from source: include params 30564 1726882906.88511: variable 'interface' from source: play vars 30564 1726882906.88591: variable 'interface' from source: play vars 30564 1726882906.88705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882906.89139: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882906.89199: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882906.89244: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882906.89291: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882906.89357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882906.89395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882906.89437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882906.89479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882906.89536: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882906.89848: variable 'network_connections' from source: include params 30564 1726882906.89859: variable 'interface' from source: play vars 30564 1726882906.89935: variable 'interface' from source: play vars 30564 1726882906.89993: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882906.90079: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882906.90375: variable 'network_connections' from source: include params 30564 1726882906.90389: variable 'interface' from source: play vars 30564 1726882906.90459: variable 'interface' from source: play vars 30564 1726882906.90492: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882906.90577: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882906.91029: variable 'network_connections' from source: include params 30564 1726882906.91039: variable 'interface' from source: play vars 30564 1726882906.91130: variable 'interface' from source: play vars 30564 1726882906.91237: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882906.91307: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882906.91319: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882906.91395: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882906.91625: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882906.92128: variable 'network_connections' from source: include params 30564 1726882906.92138: variable 'interface' from source: play vars 30564 1726882906.92205: variable 'interface' from source: play vars 30564 1726882906.92219: variable 'ansible_distribution' from source: facts 30564 1726882906.92227: variable '__network_rh_distros' from source: role '' defaults 30564 1726882906.92236: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.92273: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882906.92445: variable 'ansible_distribution' from source: facts 30564 1726882906.92453: variable '__network_rh_distros' from source: role '' defaults 30564 1726882906.92462: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.92479: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882906.92649: variable 'ansible_distribution' from source: facts 30564 1726882906.92658: variable '__network_rh_distros' from source: role '' defaults 30564 1726882906.92669: variable 'ansible_distribution_major_version' from source: facts 30564 1726882906.92711: variable 'network_provider' from source: set_fact 30564 1726882906.92736: variable 'omit' from source: magic vars 30564 1726882906.92768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882906.92803: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882906.92824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882906.92844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882906.92857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882906.92892: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882906.92902: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.92911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882906.93013: Set connection var ansible_timeout to 10 30564 1726882906.93026: Set connection var ansible_pipelining to False 30564 1726882906.93033: Set connection var ansible_shell_type to sh 30564 1726882906.93042: Set connection var ansible_shell_executable to /bin/sh 30564 1726882906.93052: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882906.93058: Set connection var ansible_connection to ssh 30564 1726882906.93092: variable 'ansible_shell_executable' from source: unknown 30564 1726882906.93099: variable 'ansible_connection' from source: unknown 30564 1726882906.93137: variable 'ansible_module_compression' from source: unknown 30564 1726882906.93145: variable 'ansible_shell_type' from source: unknown 30564 1726882906.93152: variable 'ansible_shell_executable' from source: unknown 30564 1726882906.93159: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882906.93169: variable 'ansible_pipelining' from source: unknown 30564 1726882906.93175: variable 'ansible_timeout' from source: unknown 30564 1726882906.93182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882906.93709: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882906.93729: variable 'omit' from source: magic vars 30564 1726882906.93741: starting attempt loop 30564 1726882906.93749: running the handler 30564 1726882906.93845: variable 'ansible_facts' from source: unknown 30564 1726882906.94682: _low_level_execute_command(): starting 30564 1726882906.94692: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882906.95405: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882906.95419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882906.95437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882906.95453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882906.95497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882906.95509: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882906.95522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882906.95542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882906.95552: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882906.95562: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882906.95577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882906.95590: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882906.95604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882906.95615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882906.95625: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882906.95637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882906.95727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882906.95745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882906.95763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882906.96002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882906.97657: stdout chunk (state=3): >>>/root <<< 30564 1726882906.97842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882906.97845: stdout chunk (state=3): >>><<< 30564 1726882906.97848: stderr chunk (state=3): >>><<< 30564 1726882906.97956: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882906.97960: _low_level_execute_command(): starting 30564 1726882906.97963: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412 `" && echo ansible-tmp-1726882906.978673-35158-262550615074412="` echo /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412 `" ) && sleep 0' 30564 1726882906.98708: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882906.98721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882906.98735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882906.98772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882906.98818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882906.98830: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882906.98844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882906.98887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882906.98898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882906.98908: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882906.98919: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882906.98935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882906.98958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882906.98977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882906.99478: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882906.99494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882906.99873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882906.99890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882906.99904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882907.00037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882907.01922: stdout chunk (state=3): >>>ansible-tmp-1726882906.978673-35158-262550615074412=/root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412 <<< 30564 1726882907.02111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882907.02114: stdout chunk (state=3): >>><<< 30564 1726882907.02122: stderr chunk (state=3): >>><<< 30564 1726882907.02138: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882906.978673-35158-262550615074412=/root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882907.02192: variable 'ansible_module_compression' from source: unknown 30564 1726882907.02248: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882907.02312: variable 'ansible_facts' from source: unknown 30564 1726882907.02523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412/AnsiballZ_systemd.py 30564 1726882907.02989: Sending initial data 30564 1726882907.02993: Sent initial data (155 bytes) 30564 1726882907.04839: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882907.04847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.04859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.04882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.04925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.04931: stderr chunk (state=3): >>>debug2: match not found <<< 30564 
1726882907.04941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.04953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882907.04960: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882907.04968: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882907.04983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.04992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.05002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.05015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.05021: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882907.05030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.05112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882907.05134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882907.05145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882907.05270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882907.07031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882907.07123: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882907.07223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp_6xe8cis /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412/AnsiballZ_systemd.py <<< 30564 1726882907.07319: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882907.09976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882907.10119: stderr chunk (state=3): >>><<< 30564 1726882907.10123: stdout chunk (state=3): >>><<< 30564 1726882907.10147: done transferring module to remote 30564 1726882907.10158: _low_level_execute_command(): starting 30564 1726882907.10163: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412/ /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412/AnsiballZ_systemd.py && sleep 0' 30564 1726882907.10786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882907.10794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.10808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.10823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.10857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.10865: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882907.10879: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.10892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882907.10900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882907.10906: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882907.10915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.10923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.10934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.10941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.10947: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882907.10957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.11056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882907.11071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882907.11088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882907.11220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882907.13036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882907.13057: stderr chunk (state=3): >>><<< 30564 1726882907.13060: stdout chunk (state=3): >>><<< 30564 1726882907.13148: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882907.13155: _low_level_execute_command(): starting 30564 1726882907.13158: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412/AnsiballZ_systemd.py && sleep 0' 30564 1726882907.13698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882907.13711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.13724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.13740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.13785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.13800: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882907.13818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.13837: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882907.13853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882907.13870: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882907.13885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.13901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.13916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.13927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.13936: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882907.13948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.14025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882907.14041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882907.14055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882907.14197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882907.39215: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", 
"BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882907.39247: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9175040", "MemoryAvailable": "infinity", "CPUUsageNSec": "2347530000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": 
"no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882907.39267: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": 
"shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": 
"none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882907.40906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882907.40911: stdout chunk (state=3): >>><<< 30564 1726882907.40916: stderr chunk (state=3): >>><<< 30564 1726882907.40934: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9175040", "MemoryAvailable": "infinity", "CPUUsageNSec": "2347530000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 
21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882907.41122: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882907.41141: _low_level_execute_command(): starting 30564 1726882907.41144: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882906.978673-35158-262550615074412/ > /dev/null 2>&1 && sleep 0' 30564 1726882907.41779: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882907.41787: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882907.41799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.41811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.41850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.41853: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882907.41866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.41888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882907.41895: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882907.41903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882907.41909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.41918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.41930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.41937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.41962: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882907.41979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.42109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882907.42126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882907.42136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882907.42255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30564 1726882907.44094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882907.44160: stderr chunk (state=3): >>><<< 30564 1726882907.44163: stdout chunk (state=3): >>><<< 30564 1726882907.44181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882907.44187: handler run complete 30564 1726882907.44245: attempt loop complete, returning result 30564 1726882907.44248: _execute() done 30564 1726882907.44250: dumping result to json 30564 1726882907.44262: done dumping result, returning 30564 1726882907.44267: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-0000000021af] 30564 1726882907.44276: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021af 30564 1726882907.45142: done 
sending task result for task 0e448fcc-3ce9-4216-acec-0000000021af 30564 1726882907.45145: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882907.45202: no more pending results, returning what we have 30564 1726882907.45205: results queue empty 30564 1726882907.45206: checking for any_errors_fatal 30564 1726882907.45210: done checking for any_errors_fatal 30564 1726882907.45211: checking for max_fail_percentage 30564 1726882907.45212: done checking for max_fail_percentage 30564 1726882907.45213: checking to see if all hosts have failed and the running result is not ok 30564 1726882907.45214: done checking to see if all hosts have failed 30564 1726882907.45215: getting the remaining hosts for this loop 30564 1726882907.45216: done getting the remaining hosts for this loop 30564 1726882907.45219: getting the next task for host managed_node2 30564 1726882907.45230: done getting next task for host managed_node2 30564 1726882907.45233: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882907.45238: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882907.45251: getting variables 30564 1726882907.45252: in VariableManager get_vars() 30564 1726882907.45282: Calling all_inventory to load vars for managed_node2 30564 1726882907.45284: Calling groups_inventory to load vars for managed_node2 30564 1726882907.45285: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882907.45292: Calling all_plugins_play to load vars for managed_node2 30564 1726882907.45294: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882907.45300: Calling groups_plugins_play to load vars for managed_node2 30564 1726882907.46483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882907.47610: done with get_vars() 30564 1726882907.47627: done getting variables 30564 1726882907.47675: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:47 -0400 (0:00:00.684) 0:01:46.058 ****** 30564 1726882907.47701: entering _queue_task() for managed_node2/service 30564 1726882907.47935: worker is 1 (out of 1 available) 30564 1726882907.47949: exiting _queue_task() for managed_node2/service 30564 1726882907.47962: done queuing things up, now 
waiting for results queue to drain 30564 1726882907.47965: waiting for pending results... 30564 1726882907.48161: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882907.48260: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b0 30564 1726882907.48274: variable 'ansible_search_path' from source: unknown 30564 1726882907.48279: variable 'ansible_search_path' from source: unknown 30564 1726882907.48307: calling self._execute() 30564 1726882907.48385: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882907.48389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882907.48399: variable 'omit' from source: magic vars 30564 1726882907.48704: variable 'ansible_distribution_major_version' from source: facts 30564 1726882907.48723: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882907.48853: variable 'network_provider' from source: set_fact 30564 1726882907.48871: Evaluated conditional (network_provider == "nm"): True 30564 1726882907.48988: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882907.49093: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882907.49285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882907.51049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882907.51098: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882907.51126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882907.51151: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882907.51173: Loading 
FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882907.51358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882907.51381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882907.51399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882907.51428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882907.51439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882907.51475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882907.51491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882907.51507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882907.51535: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882907.51547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882907.51576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882907.51592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882907.51608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882907.51632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882907.51643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882907.51741: variable 'network_connections' from source: include params 30564 1726882907.51749: variable 'interface' from source: play vars 30564 1726882907.51799: variable 'interface' from source: play vars 30564 1726882907.51850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882907.51959: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 
1726882907.51989: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882907.52026: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882907.52054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882907.52091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882907.52106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882907.52124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882907.52141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882907.52181: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882907.52339: variable 'network_connections' from source: include params 30564 1726882907.52343: variable 'interface' from source: play vars 30564 1726882907.52388: variable 'interface' from source: play vars 30564 1726882907.52419: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882907.52422: when evaluation is False, skipping this task 30564 1726882907.52425: _execute() done 30564 1726882907.52427: dumping result to json 30564 1726882907.52430: done dumping result, returning 30564 1726882907.52437: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start 
wpa_supplicant [0e448fcc-3ce9-4216-acec-0000000021b0] 30564 1726882907.52448: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b0 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882907.52583: no more pending results, returning what we have 30564 1726882907.52587: results queue empty 30564 1726882907.52588: checking for any_errors_fatal 30564 1726882907.52613: done checking for any_errors_fatal 30564 1726882907.52613: checking for max_fail_percentage 30564 1726882907.52615: done checking for max_fail_percentage 30564 1726882907.52616: checking to see if all hosts have failed and the running result is not ok 30564 1726882907.52617: done checking to see if all hosts have failed 30564 1726882907.52618: getting the remaining hosts for this loop 30564 1726882907.52620: done getting the remaining hosts for this loop 30564 1726882907.52626: getting the next task for host managed_node2 30564 1726882907.52634: done getting next task for host managed_node2 30564 1726882907.52638: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882907.52644: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882907.52666: getting variables 30564 1726882907.52670: in VariableManager get_vars() 30564 1726882907.52710: Calling all_inventory to load vars for managed_node2 30564 1726882907.52712: Calling groups_inventory to load vars for managed_node2 30564 1726882907.52714: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882907.52724: Calling all_plugins_play to load vars for managed_node2 30564 1726882907.52727: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882907.52729: Calling groups_plugins_play to load vars for managed_node2 30564 1726882907.53306: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b0 30564 1726882907.53310: WORKER PROCESS EXITING 30564 1726882907.54138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882907.55238: done with get_vars() 30564 1726882907.55255: done getting variables 30564 1726882907.55300: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:47 -0400 (0:00:00.076) 0:01:46.134 ****** 30564 1726882907.55324: entering _queue_task() for managed_node2/service 30564 
1726882907.55545: worker is 1 (out of 1 available) 30564 1726882907.55558: exiting _queue_task() for managed_node2/service 30564 1726882907.55574: done queuing things up, now waiting for results queue to drain 30564 1726882907.55576: waiting for pending results... 30564 1726882907.55763: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882907.55852: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b1 30564 1726882907.55865: variable 'ansible_search_path' from source: unknown 30564 1726882907.55872: variable 'ansible_search_path' from source: unknown 30564 1726882907.55899: calling self._execute() 30564 1726882907.55979: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882907.55982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882907.55995: variable 'omit' from source: magic vars 30564 1726882907.56314: variable 'ansible_distribution_major_version' from source: facts 30564 1726882907.56359: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882907.56821: variable 'network_provider' from source: set_fact 30564 1726882907.56832: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882907.56838: when evaluation is False, skipping this task 30564 1726882907.56844: _execute() done 30564 1726882907.56850: dumping result to json 30564 1726882907.56856: done dumping result, returning 30564 1726882907.56868: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-0000000021b1] 30564 1726882907.56878: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b1 30564 1726882907.56981: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b1 30564 1726882907.56988: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' 
was specified for this result", "changed": false } 30564 1726882907.57115: no more pending results, returning what we have 30564 1726882907.57119: results queue empty 30564 1726882907.57120: checking for any_errors_fatal 30564 1726882907.57127: done checking for any_errors_fatal 30564 1726882907.57128: checking for max_fail_percentage 30564 1726882907.57130: done checking for max_fail_percentage 30564 1726882907.57130: checking to see if all hosts have failed and the running result is not ok 30564 1726882907.57131: done checking to see if all hosts have failed 30564 1726882907.57132: getting the remaining hosts for this loop 30564 1726882907.57133: done getting the remaining hosts for this loop 30564 1726882907.57137: getting the next task for host managed_node2 30564 1726882907.57143: done getting next task for host managed_node2 30564 1726882907.57147: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882907.57153: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882907.57221: getting variables 30564 1726882907.57223: in VariableManager get_vars() 30564 1726882907.57259: Calling all_inventory to load vars for managed_node2 30564 1726882907.57261: Calling groups_inventory to load vars for managed_node2 30564 1726882907.57265: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882907.57274: Calling all_plugins_play to load vars for managed_node2 30564 1726882907.57276: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882907.57279: Calling groups_plugins_play to load vars for managed_node2 30564 1726882907.58628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882907.60372: done with get_vars() 30564 1726882907.60391: done getting variables 30564 1726882907.60440: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:47 -0400 (0:00:00.051) 0:01:46.186 ****** 30564 1726882907.60473: entering _queue_task() for managed_node2/copy 30564 1726882907.60730: worker is 1 (out of 1 available) 30564 1726882907.60743: exiting _queue_task() for managed_node2/copy 30564 1726882907.60756: done queuing things up, now waiting for results queue to drain 30564 1726882907.60757: waiting for pending results... 
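Each of the skips above is driven by a `when:` guard in the role's tasks file: the log first evaluates `ansible_distribution_major_version != '6'` as True, then `network_provider == "initscripts"` as False, and skips the task. A minimal sketch of the shape such a guarded task takes (illustrative only, not the role's actual source; the real task lives at roles/network/tasks/main.yml:142 in the collection):

```yaml
# Hypothetical task mirroring the conditionals the log evaluates:
#   "Evaluated conditional (ansible_distribution_major_version != '6'): True"
#   "Evaluated conditional (network_provider == \"initscripts\"): False"
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"   # False here, so the task is skipped
```

When any item in the `when:` list is False, the executor emits the `skipping: [managed_node2]` result seen above with `"skip_reason": "Conditional result was False"`, without ever contacting the remote host.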
30564 1726882907.61060: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882907.61226: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b2 30564 1726882907.61244: variable 'ansible_search_path' from source: unknown 30564 1726882907.61254: variable 'ansible_search_path' from source: unknown 30564 1726882907.61295: calling self._execute() 30564 1726882907.61406: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882907.61423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882907.61439: variable 'omit' from source: magic vars 30564 1726882907.62017: variable 'ansible_distribution_major_version' from source: facts 30564 1726882907.62035: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882907.62265: variable 'network_provider' from source: set_fact 30564 1726882907.62279: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882907.62287: when evaluation is False, skipping this task 30564 1726882907.62296: _execute() done 30564 1726882907.62303: dumping result to json 30564 1726882907.62310: done dumping result, returning 30564 1726882907.62323: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-0000000021b2] 30564 1726882907.62339: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b2 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882907.62496: no more pending results, returning what we have 30564 1726882907.62501: results queue empty 30564 1726882907.62503: checking for any_errors_fatal 30564 1726882907.62510: done checking for any_errors_fatal 30564 1726882907.62511: checking for max_fail_percentage 30564 
1726882907.62514: done checking for max_fail_percentage 30564 1726882907.62515: checking to see if all hosts have failed and the running result is not ok 30564 1726882907.62515: done checking to see if all hosts have failed 30564 1726882907.62516: getting the remaining hosts for this loop 30564 1726882907.62518: done getting the remaining hosts for this loop 30564 1726882907.62522: getting the next task for host managed_node2 30564 1726882907.62531: done getting next task for host managed_node2 30564 1726882907.62536: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882907.62542: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882907.62569: getting variables 30564 1726882907.62571: in VariableManager get_vars() 30564 1726882907.62619: Calling all_inventory to load vars for managed_node2 30564 1726882907.62622: Calling groups_inventory to load vars for managed_node2 30564 1726882907.62624: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882907.62637: Calling all_plugins_play to load vars for managed_node2 30564 1726882907.62640: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882907.62644: Calling groups_plugins_play to load vars for managed_node2 30564 1726882907.64570: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b2 30564 1726882907.64574: WORKER PROCESS EXITING 30564 1726882907.66436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882907.68583: done with get_vars() 30564 1726882907.68612: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:47 -0400 (0:00:00.082) 0:01:46.268 ****** 30564 1726882907.68709: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882907.69702: worker is 1 (out of 1 available) 30564 1726882907.69716: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882907.69730: done queuing things up, now waiting for results queue to drain 30564 1726882907.69731: waiting for pending results... 
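The `network_connections` task below is the first in this stretch whose conditionals pass, so the executor opens the SSH connection and, before shipping the module, creates a mode-0700 temp directory on the remote side (visible later in the log as the `umask 77 && mkdir -p ...` command inside `_low_level_execute_command()`). A local sketch of that pattern, with a stand-in base directory and a fixed name instead of the real timestamped `ansible-tmp-...` name:

```shell
# Sketch of Ansible's remote tmpdir creation; "demo" replaces the
# timestamped suffix (e.g. ansible-tmp-1726882907.9745479-35208-...).
base="$(mktemp -d)"     # stand-in for the remote user's home directory
( umask 77 && mkdir -p "$base/.ansible/tmp" && \
  mkdir "$base/.ansible/tmp/ansible-tmp-demo" && \
  echo ansible_tmp_demo="$base/.ansible/tmp/ansible-tmp-demo" )
```

The `umask 77` ensures every created component is private to the connecting user (mode 700), and the trailing `echo` is how the controller learns the resolved path, matching the `ansible-tmp-...=/root/.ansible/tmp/...` line in the stdout chunk below.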
30564 1726882907.70034: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882907.70207: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b3 30564 1726882907.70226: variable 'ansible_search_path' from source: unknown 30564 1726882907.70232: variable 'ansible_search_path' from source: unknown 30564 1726882907.70275: calling self._execute() 30564 1726882907.70390: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882907.70404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882907.70420: variable 'omit' from source: magic vars 30564 1726882907.70814: variable 'ansible_distribution_major_version' from source: facts 30564 1726882907.70839: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882907.70852: variable 'omit' from source: magic vars 30564 1726882907.70917: variable 'omit' from source: magic vars 30564 1726882907.71094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882907.73420: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882907.73497: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882907.73540: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882907.73586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882907.73616: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882907.73704: variable 'network_provider' from source: set_fact 30564 1726882907.73845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882907.73885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882907.73917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882907.73966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882907.73992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882907.74073: variable 'omit' from source: magic vars 30564 1726882907.74192: variable 'omit' from source: magic vars 30564 1726882907.74303: variable 'network_connections' from source: include params 30564 1726882907.74432: variable 'interface' from source: play vars 30564 1726882907.74496: variable 'interface' from source: play vars 30564 1726882907.74822: variable 'omit' from source: magic vars 30564 1726882907.74880: variable '__lsr_ansible_managed' from source: task vars 30564 1726882907.74943: variable '__lsr_ansible_managed' from source: task vars 30564 1726882907.75369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882907.75829: Loaded config def from plugin (lookup/template) 30564 1726882907.75840: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882907.75973: File lookup term: get_ansible_managed.j2 30564 1726882907.75981: variable 
'ansible_search_path' from source: unknown 30564 1726882907.75990: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882907.76005: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882907.76024: variable 'ansible_search_path' from source: unknown 30564 1726882907.93306: variable 'ansible_managed' from source: unknown 30564 1726882907.93448: variable 'omit' from source: magic vars 30564 1726882907.93486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882907.93508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882907.93521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882907.93537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882907.93545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882907.93568: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882907.93577: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882907.93580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882907.93795: Set connection var ansible_timeout to 10 30564 1726882907.93802: Set connection var ansible_pipelining to False 30564 1726882907.93805: Set connection var ansible_shell_type to sh 30564 1726882907.93811: Set connection var ansible_shell_executable to /bin/sh 30564 1726882907.93819: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882907.93821: Set connection var ansible_connection to ssh 30564 1726882907.93906: variable 'ansible_shell_executable' from source: unknown 30564 1726882907.93909: variable 'ansible_connection' from source: unknown 30564 1726882907.93912: variable 'ansible_module_compression' from source: unknown 30564 1726882907.93914: variable 'ansible_shell_type' from source: unknown 30564 1726882907.93917: variable 'ansible_shell_executable' from source: unknown 30564 1726882907.93920: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882907.93922: variable 'ansible_pipelining' from source: unknown 30564 1726882907.93924: variable 'ansible_timeout' from source: unknown 30564 1726882907.93930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882907.94171: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882907.94238: variable 'omit' from 
source: magic vars 30564 1726882907.94241: starting attempt loop 30564 1726882907.94244: running the handler 30564 1726882907.94253: _low_level_execute_command(): starting 30564 1726882907.94255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882907.95128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882907.95145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.95163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.95176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.95228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.95232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882907.95245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.95251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882907.95286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.95396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882907.95416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882907.95551: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 30564 1726882907.97222: stdout chunk (state=3): >>>/root <<< 30564 1726882907.97425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882907.97428: stdout chunk (state=3): >>><<< 30564 1726882907.97432: stderr chunk (state=3): >>><<< 30564 1726882907.97581: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882907.97584: _low_level_execute_command(): starting 30564 1726882907.97587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434 `" && echo ansible-tmp-1726882907.9745479-35208-23842222861434="` echo /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434 `" ) && sleep 0' 30564 1726882907.98215: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882907.98237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.98254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.98274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.98319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.98339: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882907.98356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.98379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882907.98391: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882907.98401: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882907.98412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882907.98429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882907.98452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882907.98470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882907.98483: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882907.98496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882907.98585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882907.98601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882907.98614: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 30564 1726882907.99102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882908.00971: stdout chunk (state=3): >>>ansible-tmp-1726882907.9745479-35208-23842222861434=/root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434 <<< 30564 1726882908.01084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882908.01145: stderr chunk (state=3): >>><<< 30564 1726882908.01162: stdout chunk (state=3): >>><<< 30564 1726882908.01211: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882907.9745479-35208-23842222861434=/root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882908.01226: variable 'ansible_module_compression' from source: unknown 30564 1726882908.01273: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882908.01306: variable 'ansible_facts' from source: unknown 30564 1726882908.01637: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434/AnsiballZ_network_connections.py 30564 1726882908.01640: Sending initial data 30564 1726882908.01643: Sent initial data (167 bytes) 30564 1726882908.03270: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882908.03274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882908.03276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882908.03279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882908.03281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882908.03283: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882908.03338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.03341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882908.03344: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882908.03346: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882908.03349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882908.03351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882908.03353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882908.03355: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882908.03357: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882908.03475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.03478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882908.03485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882908.03487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882908.03593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882908.05340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882908.05430: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882908.05531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpm8ksiydm /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434/AnsiballZ_network_connections.py <<< 30564 1726882908.05623: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882908.07281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882908.07457: stderr chunk 
(state=3): >>><<< 30564 1726882908.07460: stdout chunk (state=3): >>><<< 30564 1726882908.07462: done transferring module to remote 30564 1726882908.07469: _low_level_execute_command(): starting 30564 1726882908.07472: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434/ /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434/AnsiballZ_network_connections.py && sleep 0' 30564 1726882908.07838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882908.07842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882908.07875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.07878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882908.07881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.07935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882908.07938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882908.08043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 30564 1726882908.09827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882908.09882: stderr chunk (state=3): >>><<< 30564 1726882908.09885: stdout chunk (state=3): >>><<< 30564 1726882908.09933: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882908.09936: _low_level_execute_command(): starting 30564 1726882908.09939: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434/AnsiballZ_network_connections.py && sleep 0' 30564 1726882908.10357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882908.10381: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882908.10394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.10410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.10451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882908.10466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882908.10591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882908.35830: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882908.38286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882908.38344: stderr chunk (state=3): >>><<< 30564 1726882908.38349: stdout chunk (state=3): >>><<< 30564 1726882908.38372: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "present", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882908.38402: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'present', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882908.38409: _low_level_execute_command(): starting 30564 1726882908.38414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882907.9745479-35208-23842222861434/ > /dev/null 2>&1 && sleep 0' 30564 1726882908.38883: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882908.38886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882908.38924: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882908.38927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882908.38930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.38977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882908.38988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882908.39098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882908.40904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882908.40953: stderr chunk (state=3): >>><<< 30564 1726882908.40956: stdout chunk (state=3): >>><<< 30564 1726882908.40975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882908.40982: handler run complete 30564 1726882908.41003: attempt loop complete, returning result 30564 1726882908.41006: _execute() done 30564 1726882908.41008: dumping result to json 30564 1726882908.41013: done dumping result, returning 30564 1726882908.41020: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-0000000021b3] 30564 1726882908.41023: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b3 30564 1726882908.41132: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b3 30564 1726882908.41134: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f 30564 1726882908.41249: no more pending results, returning what we have 30564 1726882908.41252: results queue empty 30564 1726882908.41253: checking for any_errors_fatal 30564 1726882908.41259: done checking for 
any_errors_fatal 30564 1726882908.41260: checking for max_fail_percentage 30564 1726882908.41262: done checking for max_fail_percentage 30564 1726882908.41263: checking to see if all hosts have failed and the running result is not ok 30564 1726882908.41263: done checking to see if all hosts have failed 30564 1726882908.41272: getting the remaining hosts for this loop 30564 1726882908.41274: done getting the remaining hosts for this loop 30564 1726882908.41277: getting the next task for host managed_node2 30564 1726882908.41285: done getting next task for host managed_node2 30564 1726882908.41289: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882908.41294: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882908.41306: getting variables 30564 1726882908.41308: in VariableManager get_vars() 30564 1726882908.41346: Calling all_inventory to load vars for managed_node2 30564 1726882908.41349: Calling groups_inventory to load vars for managed_node2 30564 1726882908.41351: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882908.41360: Calling all_plugins_play to load vars for managed_node2 30564 1726882908.41362: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882908.41367: Calling groups_plugins_play to load vars for managed_node2 30564 1726882908.42361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882908.43312: done with get_vars() 30564 1726882908.43331: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:48 -0400 (0:00:00.746) 0:01:47.015 ****** 30564 1726882908.43397: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882908.43638: worker is 1 (out of 1 available) 30564 1726882908.43650: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882908.43663: done queuing things up, now waiting for results queue to drain 30564 1726882908.43666: waiting for pending results... 
30564 1726882908.43866: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882908.43967: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b4 30564 1726882908.43980: variable 'ansible_search_path' from source: unknown 30564 1726882908.43984: variable 'ansible_search_path' from source: unknown 30564 1726882908.44013: calling self._execute() 30564 1726882908.44092: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882908.44097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882908.44109: variable 'omit' from source: magic vars 30564 1726882908.44388: variable 'ansible_distribution_major_version' from source: facts 30564 1726882908.44398: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882908.44486: variable 'network_state' from source: role '' defaults 30564 1726882908.44494: Evaluated conditional (network_state != {}): False 30564 1726882908.44497: when evaluation is False, skipping this task 30564 1726882908.44499: _execute() done 30564 1726882908.44502: dumping result to json 30564 1726882908.44504: done dumping result, returning 30564 1726882908.44511: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-0000000021b4] 30564 1726882908.44516: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b4 30564 1726882908.44604: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b4 30564 1726882908.44607: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882908.44654: no more pending results, returning what we have 30564 1726882908.44658: results queue empty 30564 1726882908.44659: checking for any_errors_fatal 30564 1726882908.44675: done checking for any_errors_fatal 
30564 1726882908.44676: checking for max_fail_percentage 30564 1726882908.44679: done checking for max_fail_percentage 30564 1726882908.44680: checking to see if all hosts have failed and the running result is not ok 30564 1726882908.44680: done checking to see if all hosts have failed 30564 1726882908.44681: getting the remaining hosts for this loop 30564 1726882908.44683: done getting the remaining hosts for this loop 30564 1726882908.44687: getting the next task for host managed_node2 30564 1726882908.44694: done getting next task for host managed_node2 30564 1726882908.44698: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882908.44704: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882908.44725: getting variables 30564 1726882908.44726: in VariableManager get_vars() 30564 1726882908.44766: Calling all_inventory to load vars for managed_node2 30564 1726882908.44769: Calling groups_inventory to load vars for managed_node2 30564 1726882908.44772: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882908.44781: Calling all_plugins_play to load vars for managed_node2 30564 1726882908.44784: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882908.44786: Calling groups_plugins_play to load vars for managed_node2 30564 1726882908.45580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882908.46520: done with get_vars() 30564 1726882908.46535: done getting variables 30564 1726882908.46577: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:48 -0400 (0:00:00.032) 0:01:47.047 ****** 30564 1726882908.46604: entering _queue_task() for managed_node2/debug 30564 1726882908.46815: worker is 1 (out of 1 available) 30564 1726882908.46828: exiting _queue_task() for managed_node2/debug 30564 1726882908.46840: done queuing things up, now waiting for results queue to drain 30564 1726882908.46842: waiting for pending results... 
30564 1726882908.47036: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
30564 1726882908.47116: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b5
30564 1726882908.47127: variable 'ansible_search_path' from source: unknown
30564 1726882908.47131: variable 'ansible_search_path' from source: unknown
30564 1726882908.47160: calling self._execute()
30564 1726882908.47242: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.47246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.47258: variable 'omit' from source: magic vars
30564 1726882908.47549: variable 'ansible_distribution_major_version' from source: facts
30564 1726882908.47561: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882908.47568: variable 'omit' from source: magic vars
30564 1726882908.47619: variable 'omit' from source: magic vars
30564 1726882908.47644: variable 'omit' from source: magic vars
30564 1726882908.47681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882908.47710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882908.47729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882908.47743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882908.47753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882908.47781: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882908.47784: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.47787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.47856: Set connection var ansible_timeout to 10
30564 1726882908.47859: Set connection var ansible_pipelining to False
30564 1726882908.47862: Set connection var ansible_shell_type to sh
30564 1726882908.47870: Set connection var ansible_shell_executable to /bin/sh
30564 1726882908.47879: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882908.47882: Set connection var ansible_connection to ssh
30564 1726882908.47899: variable 'ansible_shell_executable' from source: unknown
30564 1726882908.47902: variable 'ansible_connection' from source: unknown
30564 1726882908.47904: variable 'ansible_module_compression' from source: unknown
30564 1726882908.47907: variable 'ansible_shell_type' from source: unknown
30564 1726882908.47910: variable 'ansible_shell_executable' from source: unknown
30564 1726882908.47913: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.47915: variable 'ansible_pipelining' from source: unknown
30564 1726882908.47917: variable 'ansible_timeout' from source: unknown
30564 1726882908.47921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.48025: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882908.48036: variable 'omit' from source: magic vars
30564 1726882908.48042: starting attempt loop
30564 1726882908.48045: running the handler
30564 1726882908.48141: variable '__network_connections_result' from source: set_fact
30564 1726882908.48188: handler run complete
30564 1726882908.48200: attempt loop complete, returning result
30564 1726882908.48203: _execute() done
30564 1726882908.48206: dumping result to json
30564 1726882908.48208: done dumping result, returning
30564 1726882908.48217: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000021b5]
30564 1726882908.48221: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b5
30564 1726882908.48305: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b5
30564 1726882908.48308: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f"
    ]
}
30564 1726882908.48380: no more pending results, returning what we have
30564 1726882908.48384: results queue empty
30564 1726882908.48385: checking for any_errors_fatal
30564 1726882908.48392: done checking for any_errors_fatal
30564 1726882908.48392: checking for max_fail_percentage
30564 1726882908.48394: done checking for max_fail_percentage
30564 1726882908.48395: checking to see if all hosts have failed and the running result is not ok
30564 1726882908.48396: done checking to see if all hosts have failed
30564 1726882908.48396: getting the remaining hosts for this loop
30564 1726882908.48398: done getting the remaining hosts for this loop
30564 1726882908.48402: getting the next task for host managed_node2
30564 1726882908.48409: done getting next task for host managed_node2
30564 1726882908.48413: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30564 1726882908.48424: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882908.48435: getting variables
30564 1726882908.48436: in VariableManager get_vars()
30564 1726882908.48474: Calling all_inventory to load vars for managed_node2
30564 1726882908.48476: Calling groups_inventory to load vars for managed_node2
30564 1726882908.48478: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882908.48487: Calling all_plugins_play to load vars for managed_node2
30564 1726882908.48489: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882908.48491: Calling groups_plugins_play to load vars for managed_node2
30564 1726882908.53234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882908.54157: done with get_vars()
30564 1726882908.54177: done getting variables
30564 1726882908.54208: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024  21:41:48 -0400 (0:00:00.076)       0:01:47.123 ******
30564 1726882908.54233: entering _queue_task() for managed_node2/debug
30564 1726882908.54475: worker is 1 (out of 1 available)
30564 1726882908.54487: exiting _queue_task() for managed_node2/debug
30564 1726882908.54500: done queuing things up, now waiting for results queue to drain
30564 1726882908.54501: waiting for pending results...
30564 1726882908.54709: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
30564 1726882908.54817: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b6
30564 1726882908.54833: variable 'ansible_search_path' from source: unknown
30564 1726882908.54836: variable 'ansible_search_path' from source: unknown
30564 1726882908.54866: calling self._execute()
30564 1726882908.54949: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.54955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.54966: variable 'omit' from source: magic vars
30564 1726882908.55258: variable 'ansible_distribution_major_version' from source: facts
30564 1726882908.55273: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882908.55277: variable 'omit' from source: magic vars
30564 1726882908.55317: variable 'omit' from source: magic vars
30564 1726882908.55341: variable 'omit' from source: magic vars
30564 1726882908.55377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882908.55404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882908.55419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882908.55434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882908.55445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882908.55470: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882908.55480: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.55483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.55553: Set connection var ansible_timeout to 10
30564 1726882908.55556: Set connection var ansible_pipelining to False
30564 1726882908.55559: Set connection var ansible_shell_type to sh
30564 1726882908.55567: Set connection var ansible_shell_executable to /bin/sh
30564 1726882908.55579: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882908.55582: Set connection var ansible_connection to ssh
30564 1726882908.55603: variable 'ansible_shell_executable' from source: unknown
30564 1726882908.55608: variable 'ansible_connection' from source: unknown
30564 1726882908.55611: variable 'ansible_module_compression' from source: unknown
30564 1726882908.55613: variable 'ansible_shell_type' from source: unknown
30564 1726882908.55615: variable 'ansible_shell_executable' from source: unknown
30564 1726882908.55618: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.55620: variable 'ansible_pipelining' from source: unknown
30564 1726882908.55622: variable 'ansible_timeout' from source: unknown
30564 1726882908.55624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.55726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
30564 1726882908.55737: variable 'omit' from source: magic vars
30564 1726882908.55744: starting attempt loop
30564 1726882908.55747: running the handler
30564 1726882908.55789: variable '__network_connections_result' from source: set_fact
30564 1726882908.55850: variable '__network_connections_result' from source: set_fact
30564 1726882908.55942: handler run complete
30564 1726882908.55962: attempt loop complete, returning result
30564 1726882908.55969: _execute() done
30564 1726882908.55973: dumping result to json
30564 1726882908.55975: done dumping result, returning
30564 1726882908.55981: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000021b6]
30564 1726882908.55986: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b6
30564 1726882908.56089: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b6
30564 1726882908.56091: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "ip": {
                            "auto6": false,
                            "dhcp4": false
                        },
                        "name": "statebr",
                        "persistent_state": "present",
                        "type": "bridge"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f"
        ]
    }
}
30564 1726882908.56184: no more pending results, returning what we have
30564 1726882908.56187: results queue empty
30564 1726882908.56188: checking for any_errors_fatal
30564 1726882908.56194: done checking for any_errors_fatal
30564 1726882908.56194: checking for max_fail_percentage
30564 1726882908.56196: done checking for max_fail_percentage
30564 1726882908.56196: checking to see if all hosts have failed and the running result is not ok
30564 1726882908.56197: done checking to see if all hosts have failed
30564 1726882908.56198: getting the remaining hosts for this loop
30564 1726882908.56200: done getting the remaining hosts for this loop
30564 1726882908.56203: getting the next task for host managed_node2
30564 1726882908.56209: done getting next task for host managed_node2
30564 1726882908.56212: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30564 1726882908.56220: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882908.56236: getting variables
30564 1726882908.56238: in VariableManager get_vars()
30564 1726882908.56279: Calling all_inventory to load vars for managed_node2
30564 1726882908.56282: Calling groups_inventory to load vars for managed_node2
30564 1726882908.56284: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882908.56292: Calling all_plugins_play to load vars for managed_node2
30564 1726882908.56294: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882908.56296: Calling groups_plugins_play to load vars for managed_node2
30564 1726882908.57079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882908.58118: done with get_vars()
30564 1726882908.58132: done getting variables
30564 1726882908.58174: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024  21:41:48 -0400 (0:00:00.039)       0:01:47.163 ******
30564 1726882908.58198: entering _queue_task() for managed_node2/debug
30564 1726882908.58397: worker is 1 (out of 1 available)
30564 1726882908.58410: exiting _queue_task() for managed_node2/debug
30564 1726882908.58422: done queuing things up, now waiting for results queue to drain
30564 1726882908.58424: waiting for pending results...
30564 1726882908.58607: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
30564 1726882908.58703: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b7
30564 1726882908.58714: variable 'ansible_search_path' from source: unknown
30564 1726882908.58718: variable 'ansible_search_path' from source: unknown
30564 1726882908.58745: calling self._execute()
30564 1726882908.58822: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.58830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.58841: variable 'omit' from source: magic vars
30564 1726882908.59113: variable 'ansible_distribution_major_version' from source: facts
30564 1726882908.59123: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882908.59212: variable 'network_state' from source: role '' defaults
30564 1726882908.59221: Evaluated conditional (network_state != {}): False
30564 1726882908.59224: when evaluation is False, skipping this task
30564 1726882908.59227: _execute() done
30564 1726882908.59229: dumping result to json
30564 1726882908.59232: done dumping result, returning
30564 1726882908.59239: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-0000000021b7]
30564 1726882908.59244: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b7
30564 1726882908.59335: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b7
30564 1726882908.59339: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "false_condition": "network_state != {}"
}
30564 1726882908.59391: no more pending results, returning what we have
30564 1726882908.59394: results queue empty
30564 1726882908.59395: checking for any_errors_fatal
30564 1726882908.59402: done checking for any_errors_fatal
30564 1726882908.59403: checking for max_fail_percentage
30564 1726882908.59404: done checking for max_fail_percentage
30564 1726882908.59405: checking to see if all hosts have failed and the running result is not ok
30564 1726882908.59406: done checking to see if all hosts have failed
30564 1726882908.59407: getting the remaining hosts for this loop
30564 1726882908.59408: done getting the remaining hosts for this loop
30564 1726882908.59411: getting the next task for host managed_node2
30564 1726882908.59417: done getting next task for host managed_node2
30564 1726882908.59421: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
30564 1726882908.59426: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882908.59442: getting variables
30564 1726882908.59443: in VariableManager get_vars()
30564 1726882908.59478: Calling all_inventory to load vars for managed_node2
30564 1726882908.59480: Calling groups_inventory to load vars for managed_node2
30564 1726882908.59482: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882908.59493: Calling all_plugins_play to load vars for managed_node2
30564 1726882908.59495: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882908.59496: Calling groups_plugins_play to load vars for managed_node2
30564 1726882908.60265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882908.61221: done with get_vars()
30564 1726882908.61236: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024  21:41:48 -0400 (0:00:00.031)       0:01:47.194 ******
30564 1726882908.61303: entering _queue_task() for managed_node2/ping
30564 1726882908.61512: worker is 1 (out of 1 available)
30564 1726882908.61527: exiting _queue_task() for managed_node2/ping
30564 1726882908.61538: done queuing things up, now waiting for results queue to drain
30564 1726882908.61539: waiting for pending results...
30564 1726882908.61725: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
30564 1726882908.61815: in run() - task 0e448fcc-3ce9-4216-acec-0000000021b8
30564 1726882908.61826: variable 'ansible_search_path' from source: unknown
30564 1726882908.61830: variable 'ansible_search_path' from source: unknown
30564 1726882908.61858: calling self._execute()
30564 1726882908.61942: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.61946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.61956: variable 'omit' from source: magic vars
30564 1726882908.62244: variable 'ansible_distribution_major_version' from source: facts
30564 1726882908.62256: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882908.62261: variable 'omit' from source: magic vars
30564 1726882908.62307: variable 'omit' from source: magic vars
30564 1726882908.62328: variable 'omit' from source: magic vars
30564 1726882908.62360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
30564 1726882908.62389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
30564 1726882908.62406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882908.62422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882908.62432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
30564 1726882908.62454: variable 'inventory_hostname' from source: host vars for 'managed_node2'
30564 1726882908.62457: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.62460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.62532: Set connection var ansible_timeout to 10
30564 1726882908.62536: Set connection var ansible_pipelining to False
30564 1726882908.62539: Set connection var ansible_shell_type to sh
30564 1726882908.62541: Set connection var ansible_shell_executable to /bin/sh
30564 1726882908.62550: Set connection var ansible_module_compression to ZIP_DEFLATED
30564 1726882908.62552: Set connection var ansible_connection to ssh
30564 1726882908.62573: variable 'ansible_shell_executable' from source: unknown
30564 1726882908.62576: variable 'ansible_connection' from source: unknown
30564 1726882908.62579: variable 'ansible_module_compression' from source: unknown
30564 1726882908.62581: variable 'ansible_shell_type' from source: unknown
30564 1726882908.62583: variable 'ansible_shell_executable' from source: unknown
30564 1726882908.62585: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882908.62587: variable 'ansible_pipelining' from source: unknown
30564 1726882908.62590: variable 'ansible_timeout' from source: unknown
30564 1726882908.62593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882908.62742: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
30564 1726882908.62747: variable 'omit' from source: magic vars
30564 1726882908.62753: starting attempt loop
30564 1726882908.62755: running the handler
30564 1726882908.62772: _low_level_execute_command(): starting
30564 1726882908.62782: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
30564 1726882908.63316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882908.63333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882908.63346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<<
30564 1726882908.63358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882908.63408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882908.63424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882908.63542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882908.65210: stdout chunk (state=3): >>>/root <<<
30564 1726882908.65305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882908.65363: stderr chunk (state=3): >>><<<
30564 1726882908.65368: stdout chunk (state=3): >>><<<
30564 1726882908.65391: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30564 1726882908.65402: _low_level_execute_command(): starting
30564 1726882908.65408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173 `" && echo ansible-tmp-1726882908.6538935-35235-250970083948173="` echo /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173 `" ) && sleep 0'
30564 1726882908.65848: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882908.65862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882908.65889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<<
30564 1726882908.65908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882908.65946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882908.65962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882908.66061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882908.67923: stdout chunk (state=3): >>>ansible-tmp-1726882908.6538935-35235-250970083948173=/root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173 <<<
30564 1726882908.68032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882908.68080: stderr chunk (state=3): >>><<<
30564 1726882908.68084: stdout chunk (state=3): >>><<<
30564 1726882908.68100: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882908.6538935-35235-250970083948173=/root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
30564 1726882908.68138: variable 'ansible_module_compression' from source: unknown
30564 1726882908.68174: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
30564 1726882908.68203: variable 'ansible_facts' from source: unknown
30564 1726882908.68254: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173/AnsiballZ_ping.py
30564 1726882908.68359: Sending initial data
30564 1726882908.68374: Sent initial data (153 bytes)
30564 1726882908.69081: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882908.69084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882908.69108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882908.69126: stderr chunk (state=3): >>>debug2: match not found <<<
30564 1726882908.69143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882908.69162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
30564 1726882908.69181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<<
30564 1726882908.69193: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
30564 1726882908.69205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882908.69225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882908.69247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882908.69260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<<
30564 1726882908.69275: stderr chunk (state=3): >>>debug2: match found <<<
30564 1726882908.69289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882908.69374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
30564 1726882908.69392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
30564 1726882908.69406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
30564 1726882908.69536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
30564 1726882908.71247: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<<
30564 1726882908.71253: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
30564 1726882908.71342: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
30564 1726882908.71438: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmptnjwwxp0 /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173/AnsiballZ_ping.py <<<
30564 1726882908.71532: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
30564 1726882908.72521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
30564 1726882908.72611: stderr chunk (state=3): >>><<<
30564 1726882908.72614: stdout chunk (state=3): >>><<<
30564 1726882908.72630: done transferring module to remote
30564 1726882908.72639: _low_level_execute_command(): starting
30564 1726882908.72643: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173/ /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173/AnsiballZ_ping.py && sleep 0'
30564 1726882908.73076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
30564 1726882908.73080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
30564 1726882908.73113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
30564 1726882908.73116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
30564 1726882908.73120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.73184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882908.73187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882908.73284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882908.75001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882908.75042: stderr chunk (state=3): >>><<< 30564 1726882908.75045: stdout chunk (state=3): >>><<< 30564 1726882908.75060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882908.75070: _low_level_execute_command(): starting 30564 1726882908.75076: _low_level_execute_command(): executing: /bin/sh -c 
'/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173/AnsiballZ_ping.py && sleep 0' 30564 1726882908.75504: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882908.75518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882908.75539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.75551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.75599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882908.75611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882908.75733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882908.88519: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882908.89495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882908.89511: stderr chunk (state=3): >>>Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882908.89559: stderr chunk (state=3): >>><<< 30564 1726882908.89562: stdout chunk (state=3): >>><<< 30564 1726882908.89583: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882908.89606: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882908.89616: _low_level_execute_command(): starting 30564 1726882908.89621: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882908.6538935-35235-250970083948173/ > /dev/null 2>&1 && sleep 0' 30564 1726882908.90087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882908.90093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882908.90126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.90138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882908.90201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882908.90206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882908.90318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882908.92114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882908.92160: stderr chunk (state=3): >>><<< 30564 1726882908.92168: stdout chunk (state=3): >>><<< 30564 1726882908.92183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882908.92192: handler run complete 30564 1726882908.92205: attempt loop complete, returning result 
30564 1726882908.92208: _execute() done 30564 1726882908.92210: dumping result to json 30564 1726882908.92213: done dumping result, returning 30564 1726882908.92222: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-0000000021b8] 30564 1726882908.92227: sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b8 30564 1726882908.92322: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000021b8 30564 1726882908.92325: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882908.92386: no more pending results, returning what we have 30564 1726882908.92390: results queue empty 30564 1726882908.92391: checking for any_errors_fatal 30564 1726882908.92400: done checking for any_errors_fatal 30564 1726882908.92400: checking for max_fail_percentage 30564 1726882908.92402: done checking for max_fail_percentage 30564 1726882908.92403: checking to see if all hosts have failed and the running result is not ok 30564 1726882908.92404: done checking to see if all hosts have failed 30564 1726882908.92405: getting the remaining hosts for this loop 30564 1726882908.92406: done getting the remaining hosts for this loop 30564 1726882908.92410: getting the next task for host managed_node2 30564 1726882908.92422: done getting next task for host managed_node2 30564 1726882908.92424: ^ task is: TASK: meta (role_complete) 30564 1726882908.92429: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882908.92440: getting variables 30564 1726882908.92442: in VariableManager get_vars() 30564 1726882908.92489: Calling all_inventory to load vars for managed_node2 30564 1726882908.92491: Calling groups_inventory to load vars for managed_node2 30564 1726882908.92493: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882908.92504: Calling all_plugins_play to load vars for managed_node2 30564 1726882908.92507: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882908.92509: Calling groups_plugins_play to load vars for managed_node2 30564 1726882908.93531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882908.94474: done with get_vars() 30564 1726882908.94490: done getting variables 30564 1726882908.94550: done queuing things up, now waiting for results queue to drain 30564 1726882908.94552: results queue empty 30564 1726882908.94552: checking for any_errors_fatal 30564 1726882908.94554: done checking for any_errors_fatal 30564 1726882908.94555: checking for max_fail_percentage 30564 1726882908.94555: done checking for max_fail_percentage 30564 1726882908.94556: checking to see if all 
hosts have failed and the running result is not ok 30564 1726882908.94556: done checking to see if all hosts have failed 30564 1726882908.94557: getting the remaining hosts for this loop 30564 1726882908.94557: done getting the remaining hosts for this loop 30564 1726882908.94559: getting the next task for host managed_node2 30564 1726882908.94562: done getting next task for host managed_node2 30564 1726882908.94565: ^ task is: TASK: Show result 30564 1726882908.94567: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882908.94569: getting variables 30564 1726882908.94570: in VariableManager get_vars() 30564 1726882908.94578: Calling all_inventory to load vars for managed_node2 30564 1726882908.94579: Calling groups_inventory to load vars for managed_node2 30564 1726882908.94581: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882908.94584: Calling all_plugins_play to load vars for managed_node2 30564 1726882908.94586: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882908.94587: Calling groups_plugins_play to load vars for managed_node2 30564 1726882908.95279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882908.96227: done with get_vars() 30564 1726882908.96241: done getting variables 30564 1726882908.96283: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bridge_profile.yml:14 Friday 20 September 2024 21:41:48 -0400 (0:00:00.350) 0:01:47.544 ****** 30564 1726882908.96306: entering _queue_task() for managed_node2/debug 30564 1726882908.96547: worker is 1 (out of 1 available) 30564 1726882908.96561: exiting _queue_task() for managed_node2/debug 30564 1726882908.96577: done queuing things up, now waiting for results queue to drain 30564 1726882908.96578: waiting for pending results... 
30564 1726882908.96782: running TaskExecutor() for managed_node2/TASK: Show result 30564 1726882908.96867: in run() - task 0e448fcc-3ce9-4216-acec-00000000213a 30564 1726882908.96879: variable 'ansible_search_path' from source: unknown 30564 1726882908.96883: variable 'ansible_search_path' from source: unknown 30564 1726882908.96914: calling self._execute() 30564 1726882908.96994: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882908.96998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882908.97012: variable 'omit' from source: magic vars 30564 1726882908.97319: variable 'ansible_distribution_major_version' from source: facts 30564 1726882908.97331: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882908.97336: variable 'omit' from source: magic vars 30564 1726882908.97374: variable 'omit' from source: magic vars 30564 1726882908.97396: variable 'omit' from source: magic vars 30564 1726882908.97429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882908.97455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882908.97480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882908.97494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882908.97504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882908.97529: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882908.97532: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882908.97534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882908.97615: Set 
connection var ansible_timeout to 10 30564 1726882908.97619: Set connection var ansible_pipelining to False 30564 1726882908.97622: Set connection var ansible_shell_type to sh 30564 1726882908.97627: Set connection var ansible_shell_executable to /bin/sh 30564 1726882908.97634: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882908.97637: Set connection var ansible_connection to ssh 30564 1726882908.97654: variable 'ansible_shell_executable' from source: unknown 30564 1726882908.97657: variable 'ansible_connection' from source: unknown 30564 1726882908.97660: variable 'ansible_module_compression' from source: unknown 30564 1726882908.97662: variable 'ansible_shell_type' from source: unknown 30564 1726882908.97666: variable 'ansible_shell_executable' from source: unknown 30564 1726882908.97676: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882908.97679: variable 'ansible_pipelining' from source: unknown 30564 1726882908.97681: variable 'ansible_timeout' from source: unknown 30564 1726882908.97686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882908.97804: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882908.97814: variable 'omit' from source: magic vars 30564 1726882908.97819: starting attempt loop 30564 1726882908.97822: running the handler 30564 1726882908.97859: variable '__network_connections_result' from source: set_fact 30564 1726882908.97925: variable '__network_connections_result' from source: set_fact 30564 1726882908.98012: handler run complete 30564 1726882908.98030: attempt loop complete, returning result 30564 1726882908.98033: _execute() done 30564 1726882908.98035: dumping result to json 30564 
1726882908.98039: done dumping result, returning 30564 1726882908.98047: done running TaskExecutor() for managed_node2/TASK: Show result [0e448fcc-3ce9-4216-acec-00000000213a] 30564 1726882908.98052: sending task result for task 0e448fcc-3ce9-4216-acec-00000000213a 30564 1726882908.98147: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000213a 30564 1726882908.98150: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "statebr", "persistent_state": "present", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'statebr': add connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f" ] } } 30564 1726882908.98230: no more pending results, returning what we have 30564 1726882908.98234: results queue empty 30564 1726882908.98235: checking for any_errors_fatal 30564 1726882908.98238: done checking for any_errors_fatal 30564 1726882908.98238: checking for max_fail_percentage 30564 1726882908.98240: done checking for max_fail_percentage 30564 1726882908.98241: checking to see if all hosts have failed and the running result is not ok 30564 1726882908.98242: done checking to see if all hosts have failed 30564 1726882908.98243: getting the remaining hosts for this loop 30564 1726882908.98245: done getting the remaining hosts for this loop 30564 1726882908.98248: getting the next task for host managed_node2 30564 1726882908.98265: done getting next task for host managed_node2 30564 1726882908.98273: ^ task is: TASK: Include network role 30564 1726882908.98276: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882908.98281: getting variables 30564 1726882908.98282: in VariableManager get_vars() 30564 1726882908.98317: Calling all_inventory to load vars for managed_node2 30564 1726882908.98320: Calling groups_inventory to load vars for managed_node2 30564 1726882908.98323: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882908.98332: Calling all_plugins_play to load vars for managed_node2 30564 1726882908.98335: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882908.98337: Calling groups_plugins_play to load vars for managed_node2 30564 1726882908.99319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.00270: done with get_vars() 30564 1726882909.00288: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/activate_profile.yml:3 Friday 20 September 2024 21:41:49 -0400 (0:00:00.040) 0:01:47.584 ****** 30564 1726882909.00358: entering _queue_task() for 
managed_node2/include_role 30564 1726882909.00611: worker is 1 (out of 1 available) 30564 1726882909.00624: exiting _queue_task() for managed_node2/include_role 30564 1726882909.00637: done queuing things up, now waiting for results queue to drain 30564 1726882909.00638: waiting for pending results... 30564 1726882909.00840: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882909.00939: in run() - task 0e448fcc-3ce9-4216-acec-00000000213e 30564 1726882909.00950: variable 'ansible_search_path' from source: unknown 30564 1726882909.00953: variable 'ansible_search_path' from source: unknown 30564 1726882909.00988: calling self._execute() 30564 1726882909.01064: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.01069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882909.01083: variable 'omit' from source: magic vars 30564 1726882909.01379: variable 'ansible_distribution_major_version' from source: facts 30564 1726882909.01391: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882909.01396: _execute() done 30564 1726882909.01399: dumping result to json 30564 1726882909.01404: done dumping result, returning 30564 1726882909.01410: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-00000000213e] 30564 1726882909.01421: sending task result for task 0e448fcc-3ce9-4216-acec-00000000213e 30564 1726882909.01529: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000213e 30564 1726882909.01533: WORKER PROCESS EXITING 30564 1726882909.01557: no more pending results, returning what we have 30564 1726882909.01562: in VariableManager get_vars() 30564 1726882909.01609: Calling all_inventory to load vars for managed_node2 30564 1726882909.01612: Calling groups_inventory to load vars for managed_node2 30564 1726882909.01616: Calling all_plugins_inventory to load vars for managed_node2 30564 
1726882909.01629: Calling all_plugins_play to load vars for managed_node2 30564 1726882909.01633: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882909.01636: Calling groups_plugins_play to load vars for managed_node2 30564 1726882909.02578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.03520: done with get_vars() 30564 1726882909.03534: variable 'ansible_search_path' from source: unknown 30564 1726882909.03535: variable 'ansible_search_path' from source: unknown 30564 1726882909.03628: variable 'omit' from source: magic vars 30564 1726882909.03653: variable 'omit' from source: magic vars 30564 1726882909.03662: variable 'omit' from source: magic vars 30564 1726882909.03666: we have included files to process 30564 1726882909.03666: generating all_blocks data 30564 1726882909.03668: done generating all_blocks data 30564 1726882909.03673: processing included file: fedora.linux_system_roles.network 30564 1726882909.03689: in VariableManager get_vars() 30564 1726882909.03700: done with get_vars() 30564 1726882909.03719: in VariableManager get_vars() 30564 1726882909.03730: done with get_vars() 30564 1726882909.03755: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882909.03830: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882909.03880: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882909.04168: in VariableManager get_vars() 30564 1726882909.04183: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882909.05416: iterating over new_blocks loaded from include file 30564 1726882909.05418: in VariableManager get_vars() 30564 1726882909.05431: done with get_vars() 30564 1726882909.05432: 
filtering new block on tags 30564 1726882909.05598: done filtering new block on tags 30564 1726882909.05600: in VariableManager get_vars() 30564 1726882909.05610: done with get_vars() 30564 1726882909.05611: filtering new block on tags 30564 1726882909.05622: done filtering new block on tags 30564 1726882909.05623: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882909.05626: extending task lists for all hosts with included blocks 30564 1726882909.05696: done extending task lists 30564 1726882909.05697: done processing included files 30564 1726882909.05697: results queue empty 30564 1726882909.05698: checking for any_errors_fatal 30564 1726882909.05701: done checking for any_errors_fatal 30564 1726882909.05701: checking for max_fail_percentage 30564 1726882909.05702: done checking for max_fail_percentage 30564 1726882909.05702: checking to see if all hosts have failed and the running result is not ok 30564 1726882909.05703: done checking to see if all hosts have failed 30564 1726882909.05703: getting the remaining hosts for this loop 30564 1726882909.05704: done getting the remaining hosts for this loop 30564 1726882909.05706: getting the next task for host managed_node2 30564 1726882909.05709: done getting next task for host managed_node2 30564 1726882909.05710: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882909.05713: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882909.05719: getting variables 30564 1726882909.05720: in VariableManager get_vars() 30564 1726882909.05729: Calling all_inventory to load vars for managed_node2 30564 1726882909.05730: Calling groups_inventory to load vars for managed_node2 30564 1726882909.05731: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882909.05734: Calling all_plugins_play to load vars for managed_node2 30564 1726882909.05736: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882909.05737: Calling groups_plugins_play to load vars for managed_node2 30564 1726882909.06413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.07408: done with get_vars() 30564 1726882909.07423: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:49 -0400 (0:00:00.071) 0:01:47.656 ****** 30564 1726882909.07473: entering _queue_task() for managed_node2/include_tasks 30564 1726882909.07702: worker is 1 (out of 1 available) 30564 
1726882909.07715: exiting _queue_task() for managed_node2/include_tasks 30564 1726882909.07729: done queuing things up, now waiting for results queue to drain 30564 1726882909.07730: waiting for pending results... 30564 1726882909.07927: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882909.08020: in run() - task 0e448fcc-3ce9-4216-acec-000000002328 30564 1726882909.08031: variable 'ansible_search_path' from source: unknown 30564 1726882909.08033: variable 'ansible_search_path' from source: unknown 30564 1726882909.08063: calling self._execute() 30564 1726882909.08141: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.08145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882909.08156: variable 'omit' from source: magic vars 30564 1726882909.08439: variable 'ansible_distribution_major_version' from source: facts 30564 1726882909.08450: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882909.08457: _execute() done 30564 1726882909.08460: dumping result to json 30564 1726882909.08465: done dumping result, returning 30564 1726882909.08471: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-000000002328] 30564 1726882909.08484: sending task result for task 0e448fcc-3ce9-4216-acec-000000002328 30564 1726882909.08577: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002328 30564 1726882909.08579: WORKER PROCESS EXITING 30564 1726882909.08643: no more pending results, returning what we have 30564 1726882909.08648: in VariableManager get_vars() 30564 1726882909.08697: Calling all_inventory to load vars for managed_node2 30564 1726882909.08700: Calling groups_inventory to load vars for managed_node2 30564 1726882909.08702: Calling all_plugins_inventory to load vars for managed_node2 
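The queueing above corresponds to the include at `roles/network/tasks/main.yml:4`, which the log then resolves to `tasks/set_facts.yml`. A hedged sketch of the shape of that task follows; the real file in `fedora.linux_system_roles.network` may phrase it differently, and the `when:` shown is simply the conditional the log reports evaluating (here inherited and already True):

```yaml
# Approximate shape of the include queued at tasks/main.yml:4.
# Illustrative only -- the actual role file may differ in detail.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'
```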
30564 1726882909.08711: Calling all_plugins_play to load vars for managed_node2 30564 1726882909.08713: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882909.08716: Calling groups_plugins_play to load vars for managed_node2 30564 1726882909.09510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.11123: done with get_vars() 30564 1726882909.11137: variable 'ansible_search_path' from source: unknown 30564 1726882909.11138: variable 'ansible_search_path' from source: unknown 30564 1726882909.11166: we have included files to process 30564 1726882909.11167: generating all_blocks data 30564 1726882909.11169: done generating all_blocks data 30564 1726882909.11173: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882909.11174: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882909.11176: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882909.11548: done processing included file 30564 1726882909.11550: iterating over new_blocks loaded from include file 30564 1726882909.11551: in VariableManager get_vars() 30564 1726882909.11568: done with get_vars() 30564 1726882909.11570: filtering new block on tags 30564 1726882909.11589: done filtering new block on tags 30564 1726882909.11590: in VariableManager get_vars() 30564 1726882909.11606: done with get_vars() 30564 1726882909.11607: filtering new block on tags 30564 1726882909.11636: done filtering new block on tags 30564 1726882909.11638: in VariableManager get_vars() 30564 1726882909.11653: done with get_vars() 30564 1726882909.11654: filtering new block on tags 30564 1726882909.11681: done filtering new block on tags 30564 1726882909.11682: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882909.11686: extending task lists for all hosts with included blocks 30564 1726882909.12690: done extending task lists 30564 1726882909.12691: done processing included files 30564 1726882909.12692: results queue empty 30564 1726882909.12692: checking for any_errors_fatal 30564 1726882909.12695: done checking for any_errors_fatal 30564 1726882909.12695: checking for max_fail_percentage 30564 1726882909.12696: done checking for max_fail_percentage 30564 1726882909.12697: checking to see if all hosts have failed and the running result is not ok 30564 1726882909.12697: done checking to see if all hosts have failed 30564 1726882909.12698: getting the remaining hosts for this loop 30564 1726882909.12699: done getting the remaining hosts for this loop 30564 1726882909.12700: getting the next task for host managed_node2 30564 1726882909.12703: done getting next task for host managed_node2 30564 1726882909.12705: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882909.12718: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882909.12734: getting variables 30564 1726882909.12736: in VariableManager get_vars() 30564 1726882909.12750: Calling all_inventory to load vars for managed_node2 30564 1726882909.12752: Calling groups_inventory to load vars for managed_node2 30564 1726882909.12754: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882909.12758: Calling all_plugins_play to load vars for managed_node2 30564 1726882909.12760: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882909.12765: Calling groups_plugins_play to load vars for managed_node2 30564 1726882909.14463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.16169: done with get_vars() 30564 1726882909.16190: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:49 -0400 (0:00:00.087) 0:01:47.743 ****** 30564 1726882909.16260: entering _queue_task() for managed_node2/setup 30564 1726882909.16552: worker is 1 (out of 1 available) 30564 1726882909.16566: exiting _queue_task() for managed_node2/setup 30564 
1726882909.16579: done queuing things up, now waiting for results queue to drain 30564 1726882909.16580: waiting for pending results... 30564 1726882909.16873: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882909.17037: in run() - task 0e448fcc-3ce9-4216-acec-00000000237f 30564 1726882909.17055: variable 'ansible_search_path' from source: unknown 30564 1726882909.17063: variable 'ansible_search_path' from source: unknown 30564 1726882909.17105: calling self._execute() 30564 1726882909.17210: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.17222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882909.17243: variable 'omit' from source: magic vars 30564 1726882909.17614: variable 'ansible_distribution_major_version' from source: facts 30564 1726882909.17631: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882909.17847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882909.20393: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882909.20469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882909.20510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882909.20549: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882909.20587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882909.20668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 30564 1726882909.20706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882909.20738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882909.20788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882909.20807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882909.20859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882909.20894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882909.20924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882909.20971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882909.20995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882909.21156: variable '__network_required_facts' from source: role '' defaults 30564 1726882909.21171: variable 'ansible_facts' from source: unknown 30564 1726882909.22151: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882909.22160: when evaluation is False, skipping this task 30564 1726882909.22170: _execute() done 30564 1726882909.22178: dumping result to json 30564 1726882909.22189: done dumping result, returning 30564 1726882909.22201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-00000000237f] 30564 1726882909.22221: sending task result for task 0e448fcc-3ce9-4216-acec-00000000237f skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882909.22367: no more pending results, returning what we have 30564 1726882909.22372: results queue empty 30564 1726882909.22374: checking for any_errors_fatal 30564 1726882909.22376: done checking for any_errors_fatal 30564 1726882909.22377: checking for max_fail_percentage 30564 1726882909.22379: done checking for max_fail_percentage 30564 1726882909.22380: checking to see if all hosts have failed and the running result is not ok 30564 1726882909.22381: done checking to see if all hosts have failed 30564 1726882909.22382: getting the remaining hosts for this loop 30564 1726882909.22384: done getting the remaining hosts for this loop 30564 1726882909.22388: getting the next task for host managed_node2 30564 1726882909.22402: done getting next task for host managed_node2 30564 1726882909.22406: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882909.22412: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882909.22438: getting variables 30564 1726882909.22440: in VariableManager get_vars() 30564 1726882909.22489: Calling all_inventory to load vars for managed_node2 30564 1726882909.22492: Calling groups_inventory to load vars for managed_node2 30564 1726882909.22494: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882909.22506: Calling all_plugins_play to load vars for managed_node2 30564 1726882909.22510: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882909.22513: Calling groups_plugins_play to load vars for managed_node2 30564 1726882909.23483: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000237f 30564 1726882909.23491: WORKER PROCESS EXITING 30564 1726882909.24552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.26287: done with get_vars() 30564 1726882909.26311: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:49 -0400 (0:00:00.101) 0:01:47.845 ****** 30564 1726882909.26415: entering _queue_task() for managed_node2/stat 30564 1726882909.26705: worker is 1 (out of 1 available) 30564 1726882909.26718: exiting _queue_task() for managed_node2/stat 30564 1726882909.26731: done queuing things up, now waiting for results queue to drain 30564 1726882909.26732: waiting for pending results... 
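The task skipped above (`set_facts.yml:3`, dispatched as `managed_node2/setup` with its output censored by `no_log`) gathers facts only when something the role needs is missing. A hedged reconstruction: the module, the guard expression, and `no_log` are taken from the log, while `gather_subset` is an illustrative parameter, not verbatim from the role:

```yaml
# Sketch of set_facts.yml:3 as the log runs it: gather facts only if
# any fact named in __network_required_facts is absent. gather_subset
# below is an assumption; the conditional is verbatim from the log.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
```

In this run the set difference was empty, so the conditional evaluated False and the task was skipped.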
30564 1726882909.27024: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882909.27193: in run() - task 0e448fcc-3ce9-4216-acec-000000002381 30564 1726882909.27213: variable 'ansible_search_path' from source: unknown 30564 1726882909.27220: variable 'ansible_search_path' from source: unknown 30564 1726882909.27257: calling self._execute() 30564 1726882909.27360: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.27375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882909.27394: variable 'omit' from source: magic vars 30564 1726882909.27775: variable 'ansible_distribution_major_version' from source: facts 30564 1726882909.27794: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882909.27972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882909.28253: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882909.28307: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882909.28343: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882909.28385: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882909.28469: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882909.28502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882909.28531: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882909.28560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882909.28654: variable '__network_is_ostree' from source: set_fact 30564 1726882909.28668: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882909.28676: when evaluation is False, skipping this task 30564 1726882909.28684: _execute() done 30564 1726882909.28690: dumping result to json 30564 1726882909.28700: done dumping result, returning 30564 1726882909.28709: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-000000002381] 30564 1726882909.28719: sending task result for task 0e448fcc-3ce9-4216-acec-000000002381 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882909.28860: no more pending results, returning what we have 30564 1726882909.28867: results queue empty 30564 1726882909.28868: checking for any_errors_fatal 30564 1726882909.28876: done checking for any_errors_fatal 30564 1726882909.28877: checking for max_fail_percentage 30564 1726882909.28879: done checking for max_fail_percentage 30564 1726882909.28880: checking to see if all hosts have failed and the running result is not ok 30564 1726882909.28880: done checking to see if all hosts have failed 30564 1726882909.28881: getting the remaining hosts for this loop 30564 1726882909.28883: done getting the remaining hosts for this loop 30564 1726882909.28887: getting the next task for host managed_node2 30564 1726882909.28896: done getting next task for host managed_node2 30564 
1726882909.28900: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882909.28906: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882909.28930: getting variables 30564 1726882909.28931: in VariableManager get_vars() 30564 1726882909.28978: Calling all_inventory to load vars for managed_node2 30564 1726882909.28981: Calling groups_inventory to load vars for managed_node2 30564 1726882909.28983: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882909.28994: Calling all_plugins_play to load vars for managed_node2 30564 1726882909.28997: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882909.29000: Calling groups_plugins_play to load vars for managed_node2 30564 1726882909.29983: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002381 30564 1726882909.29986: WORKER PROCESS EXITING 30564 1726882909.30688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.32647: done with get_vars() 30564 1726882909.33152: done getting variables 30564 1726882909.33216: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:49 -0400 (0:00:00.068) 0:01:47.913 ****** 30564 1726882909.33257: entering _queue_task() for managed_node2/set_fact 30564 1726882909.34073: worker is 1 (out of 1 available) 30564 1726882909.34086: exiting _queue_task() for managed_node2/set_fact 30564 1726882909.34100: done queuing things up, now waiting for results queue to drain 30564 1726882909.34101: waiting for pending results... 
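The ostree check just skipped (`set_facts.yml:12`, dispatched as `managed_node2/stat`) only runs when `__network_is_ostree` has not already been set by an earlier `set_fact`. A sketch of its likely shape; the module and the guard come from the log, while the checked path is an assumption (the conventional ostree-boot marker file):

```yaml
# Sketch of set_facts.yml:12. The stat action and the guard are from
# the log; /run/ostree-booted is an assumed (conventional) marker path.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_stat
  when: not __network_is_ostree is defined
```

Since `__network_is_ostree` was already present from a prior `set_fact`, the log records `false_condition: "not __network_is_ostree is defined"` and skips both this task and the follow-up flag-setting task at `set_facts.yml:17`.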
30564 1726882909.34396: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882909.35190: in run() - task 0e448fcc-3ce9-4216-acec-000000002382 30564 1726882909.35211: variable 'ansible_search_path' from source: unknown 30564 1726882909.35218: variable 'ansible_search_path' from source: unknown 30564 1726882909.35254: calling self._execute() 30564 1726882909.35357: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.35381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882909.35401: variable 'omit' from source: magic vars 30564 1726882909.35773: variable 'ansible_distribution_major_version' from source: facts 30564 1726882909.35793: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882909.35965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882909.36238: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882909.36294: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882909.36331: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882909.36366: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882909.36453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882909.36488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882909.36517: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882909.36549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882909.36646: variable '__network_is_ostree' from source: set_fact 30564 1726882909.36658: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882909.36666: when evaluation is False, skipping this task 30564 1726882909.36676: _execute() done 30564 1726882909.36683: dumping result to json 30564 1726882909.36690: done dumping result, returning 30564 1726882909.36704: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-000000002382] 30564 1726882909.36716: sending task result for task 0e448fcc-3ce9-4216-acec-000000002382 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882909.36853: no more pending results, returning what we have 30564 1726882909.36857: results queue empty 30564 1726882909.36858: checking for any_errors_fatal 30564 1726882909.36867: done checking for any_errors_fatal 30564 1726882909.36868: checking for max_fail_percentage 30564 1726882909.36870: done checking for max_fail_percentage 30564 1726882909.36871: checking to see if all hosts have failed and the running result is not ok 30564 1726882909.36872: done checking to see if all hosts have failed 30564 1726882909.36873: getting the remaining hosts for this loop 30564 1726882909.36875: done getting the remaining hosts for this loop 30564 1726882909.36878: getting the next task for host managed_node2 30564 1726882909.36891: done getting next task for host managed_node2 30564 
1726882909.36895: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882909.36901: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882909.36925: getting variables 30564 1726882909.36927: in VariableManager get_vars() 30564 1726882909.36975: Calling all_inventory to load vars for managed_node2 30564 1726882909.36978: Calling groups_inventory to load vars for managed_node2 30564 1726882909.36980: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882909.36990: Calling all_plugins_play to load vars for managed_node2 30564 1726882909.36993: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882909.36996: Calling groups_plugins_play to load vars for managed_node2 30564 1726882909.37982: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002382 30564 1726882909.37986: WORKER PROCESS EXITING 30564 1726882909.38880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882909.40869: done with get_vars() 30564 1726882909.40891: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:49 -0400 (0:00:00.077) 0:01:47.991 ****** 30564 1726882909.40983: entering _queue_task() for managed_node2/service_facts 30564 1726882909.41253: worker is 1 (out of 1 available) 30564 1726882909.41268: exiting _queue_task() for managed_node2/service_facts 30564 1726882909.41281: done queuing things up, now waiting for results queue to drain 30564 1726882909.41282: waiting for pending results... 
30564 1726882909.42351: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882909.42715: in run() - task 0e448fcc-3ce9-4216-acec-000000002384 30564 1726882909.42891: variable 'ansible_search_path' from source: unknown 30564 1726882909.42900: variable 'ansible_search_path' from source: unknown 30564 1726882909.42942: calling self._execute() 30564 1726882909.43079: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.43207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882909.43222: variable 'omit' from source: magic vars 30564 1726882909.44033: variable 'ansible_distribution_major_version' from source: facts 30564 1726882909.44058: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882909.44073: variable 'omit' from source: magic vars 30564 1726882909.44164: variable 'omit' from source: magic vars 30564 1726882909.44205: variable 'omit' from source: magic vars 30564 1726882909.44250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882909.44298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882909.44323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882909.44346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882909.44365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882909.44403: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882909.44412: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.44420: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882909.44531: Set connection var ansible_timeout to 10 30564 1726882909.44542: Set connection var ansible_pipelining to False 30564 1726882909.44549: Set connection var ansible_shell_type to sh 30564 1726882909.44559: Set connection var ansible_shell_executable to /bin/sh 30564 1726882909.44574: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882909.44582: Set connection var ansible_connection to ssh 30564 1726882909.44615: variable 'ansible_shell_executable' from source: unknown 30564 1726882909.44624: variable 'ansible_connection' from source: unknown 30564 1726882909.44631: variable 'ansible_module_compression' from source: unknown 30564 1726882909.44638: variable 'ansible_shell_type' from source: unknown 30564 1726882909.44645: variable 'ansible_shell_executable' from source: unknown 30564 1726882909.44653: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882909.44661: variable 'ansible_pipelining' from source: unknown 30564 1726882909.44672: variable 'ansible_timeout' from source: unknown 30564 1726882909.44682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882909.44889: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882909.44907: variable 'omit' from source: magic vars 30564 1726882909.44917: starting attempt loop 30564 1726882909.44924: running the handler 30564 1726882909.44946: _low_level_execute_command(): starting 30564 1726882909.44958: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882909.45697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882909.45780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882909.45952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.45976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.46018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.46031: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882909.46049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.46069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882909.46084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882909.46096: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882909.46108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.46123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.46140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.46155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.46173: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882909.46188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.46262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882909.46292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882909.46308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882909.46444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882909.48107: stdout chunk (state=3): >>>/root <<< 30564 1726882909.48269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882909.48290: stdout chunk (state=3): >>><<< 30564 1726882909.48293: stderr chunk (state=3): >>><<< 30564 1726882909.48394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882909.48398: _low_level_execute_command(): starting 30564 1726882909.48401: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888 `" && echo ansible-tmp-1726882909.4830942-35262-3188123849888="` echo /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888 `" ) && sleep 0' 30564 1726882909.48935: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882909.48950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.48969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.48988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.49027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.49038: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882909.49050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.49071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882909.49085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882909.49095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882909.49106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.49118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.49132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.49149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.49176: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882909.49192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.49655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882909.49683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882909.49699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882909.49829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882909.51712: stdout chunk (state=3): >>>ansible-tmp-1726882909.4830942-35262-3188123849888=/root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888 <<< 30564 1726882909.51908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882909.51911: stdout chunk (state=3): >>><<< 30564 1726882909.51914: stderr chunk (state=3): >>><<< 30564 1726882909.52074: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882909.4830942-35262-3188123849888=/root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882909.52078: variable 'ansible_module_compression' from source: unknown 30564 1726882909.52080: ANSIBALLZ: using cached module: 
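The round trip above shows Ansible's standard remote temp-dir idiom: under `umask 77`, first `mkdir -p` the persistent `~/.ansible/tmp` base, then `mkdir` (without `-p`, so a collision fails loudly) a unique per-task directory, and echo the resulting path back so the controller learns it from stdout. A minimal local sketch of the same idiom — `/tmp/demo-ansible-tmp` is a hypothetical stand-in for the timestamped `ansible-tmp-<timestamp>-<pid>-<random>` name Ansible generates:

```shell
# Sketch of the temp-dir creation command from the log above.
# umask 77 makes both directories mode 700 (owner-only), matching
# how Ansible keeps module payloads private on the managed host.
( umask 77 \
  && mkdir -p "/tmp/demo-ansible-tmp" \
  && mkdir "/tmp/demo-ansible-tmp/task" \
  && echo demo-tmpdir="/tmp/demo-ansible-tmp/task" ) && sleep 0
```

The trailing `&& sleep 0` mirrors the log's commands; it forces the shell to flush and report a clean exit status over the multiplexed SSH channel.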
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882909.52082: variable 'ansible_facts' from source: unknown 30564 1726882909.52134: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888/AnsiballZ_service_facts.py 30564 1726882909.52274: Sending initial data 30564 1726882909.52278: Sent initial data (160 bytes) 30564 1726882909.53238: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882909.53247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.53265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.53549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.53553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.53555: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882909.53557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.53559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882909.53561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882909.53565: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882909.53567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.53569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.53571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.53573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882909.53575: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882909.53577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.53579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882909.54156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882909.54161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882909.54167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882909.55738: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882909.55831: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882909.55929: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpt6qlhit6 /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888/AnsiballZ_service_facts.py <<< 30564 1726882909.56027: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882909.57387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882909.57478: stderr chunk (state=3): >>><<< 30564 1726882909.57481: stdout chunk (state=3): >>><<< 30564 
1726882909.57502: done transferring module to remote 30564 1726882909.57508: _low_level_execute_command(): starting 30564 1726882909.57519: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888/ /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888/AnsiballZ_service_facts.py && sleep 0' 30564 1726882909.58249: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882909.58381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.58392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.58405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.58443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.58449: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882909.58459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.58486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882909.58585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882909.58593: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882909.58602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.58615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.58627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.58634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.58640: stderr 
chunk (state=3): >>>debug2: match found <<< 30564 1726882909.58649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.58840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882909.58856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882909.58869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882909.58996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882909.60786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882909.60847: stderr chunk (state=3): >>><<< 30564 1726882909.60850: stdout chunk (state=3): >>><<< 30564 1726882909.60871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
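After the SFTP transfer, the log shows the same two-step finish used for every module: `chmod u+x` on both the temp directory and the AnsiballZ payload, then invoke the payload with the remote interpreter (here `/usr/bin/python3.9`). A local sketch with stand-in paths and a trivial payload in place of the real zipped `service_facts` module:

```shell
set -e
# Hypothetical stand-ins for the remote temp dir and AnsiballZ payload.
tmpdir="/tmp/demo-ansiballz"
mkdir -p "$tmpdir"
printf '%s\n' 'print("module ran")' > "$tmpdir/AnsiballZ_demo.py"
# Mark the directory and payload user-executable, as in the log.
chmod u+x "$tmpdir/" "$tmpdir/AnsiballZ_demo.py"
# Execute with the remote interpreter; the log uses /usr/bin/python3.9,
# python3 is used here for portability.
python3 "$tmpdir/AnsiballZ_demo.py"   # prints "module ran"
```

The real payload is a self-extracting zip of the module plus its module_utils, which is why a single interpreter invocation is enough to run it with no other files staged.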
master 0 30564 1726882909.60876: _low_level_execute_command(): starting 30564 1726882909.60882: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888/AnsiballZ_service_facts.py && sleep 0' 30564 1726882909.61518: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882909.61527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.61538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.61552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.61598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.61609: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882909.61626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.61640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882909.61648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882909.61655: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882909.61665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882909.61680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882909.61692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882909.61700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882909.61707: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882909.61717: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882909.61801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882909.61820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882909.61834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882909.61983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882910.93943: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": 
{"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": 
"nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": 
"systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "yp<<< 30564 1726882910.93956: stdout chunk (state=3): >>>bind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "<<< 30564 1726882910.93959: stdout chunk (state=3): >>>status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": 
"oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882910.95199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882910.95260: stderr chunk (state=3): >>><<< 30564 1726882910.95266: stdout chunk (state=3): >>><<< 30564 1726882910.95287: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": 
"dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": 
"nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882910.95700: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882910.95707: _low_level_execute_command(): starting 30564 1726882910.95712: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882909.4830942-35262-3188123849888/ > /dev/null 2>&1 && sleep 0' 30564 1726882910.96179: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882910.96200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882910.96211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882910.96223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882910.96234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882910.96278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882910.96292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882910.96403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882910.98203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882910.98246: stderr chunk (state=3): >>><<< 30564 1726882910.98250: stdout chunk (state=3): >>><<< 30564 1726882910.98267: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882910.98275: handler run complete 
30564 1726882910.98380: variable 'ansible_facts' from source: unknown 30564 1726882910.98489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882910.98739: variable 'ansible_facts' from source: unknown 30564 1726882910.98818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882910.98925: attempt loop complete, returning result 30564 1726882910.98928: _execute() done 30564 1726882910.98931: dumping result to json 30564 1726882910.98965: done dumping result, returning 30564 1726882910.98975: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000002384] 30564 1726882910.98981: sending task result for task 0e448fcc-3ce9-4216-acec-000000002384 30564 1726882911.00083: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002384 30564 1726882911.00087: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882911.00166: no more pending results, returning what we have 30564 1726882911.00171: results queue empty 30564 1726882911.00172: checking for any_errors_fatal 30564 1726882911.00177: done checking for any_errors_fatal 30564 1726882911.00178: checking for max_fail_percentage 30564 1726882911.00179: done checking for max_fail_percentage 30564 1726882911.00180: checking to see if all hosts have failed and the running result is not ok 30564 1726882911.00181: done checking to see if all hosts have failed 30564 1726882911.00182: getting the remaining hosts for this loop 30564 1726882911.00183: done getting the remaining hosts for this loop 30564 1726882911.00187: getting the next task for host managed_node2 30564 1726882911.00194: done getting next task for host managed_node2 30564 1726882911.00197: ^ task 
is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882911.00203: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882911.00215: getting variables 30564 1726882911.00216: in VariableManager get_vars() 30564 1726882911.00250: Calling all_inventory to load vars for managed_node2 30564 1726882911.00252: Calling groups_inventory to load vars for managed_node2 30564 1726882911.00255: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882911.00265: Calling all_plugins_play to load vars for managed_node2 30564 1726882911.00272: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882911.00275: Calling groups_plugins_play to load vars for managed_node2 30564 1726882911.01785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882911.03583: done with get_vars() 30564 1726882911.03604: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:51 -0400 (0:00:01.627) 0:01:49.618 ****** 30564 1726882911.03701: entering _queue_task() for managed_node2/package_facts 30564 1726882911.03998: worker is 1 (out of 1 available) 30564 1726882911.04010: exiting _queue_task() for managed_node2/package_facts 30564 1726882911.04024: done queuing things up, now waiting for results queue to drain 30564 1726882911.04025: waiting for pending results... 
30564 1726882911.04324: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882911.04508: in run() - task 0e448fcc-3ce9-4216-acec-000000002385 30564 1726882911.04528: variable 'ansible_search_path' from source: unknown 30564 1726882911.04535: variable 'ansible_search_path' from source: unknown 30564 1726882911.04580: calling self._execute() 30564 1726882911.04686: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882911.04699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882911.04714: variable 'omit' from source: magic vars 30564 1726882911.05099: variable 'ansible_distribution_major_version' from source: facts 30564 1726882911.05121: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882911.05131: variable 'omit' from source: magic vars 30564 1726882911.05218: variable 'omit' from source: magic vars 30564 1726882911.05254: variable 'omit' from source: magic vars 30564 1726882911.05303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882911.05343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882911.05372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882911.05393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882911.05409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882911.05444: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882911.05451: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882911.05458: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882911.05566: Set connection var ansible_timeout to 10 30564 1726882911.05580: Set connection var ansible_pipelining to False 30564 1726882911.05587: Set connection var ansible_shell_type to sh 30564 1726882911.05596: Set connection var ansible_shell_executable to /bin/sh 30564 1726882911.05606: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882911.05612: Set connection var ansible_connection to ssh 30564 1726882911.05639: variable 'ansible_shell_executable' from source: unknown 30564 1726882911.05646: variable 'ansible_connection' from source: unknown 30564 1726882911.05654: variable 'ansible_module_compression' from source: unknown 30564 1726882911.05662: variable 'ansible_shell_type' from source: unknown 30564 1726882911.05673: variable 'ansible_shell_executable' from source: unknown 30564 1726882911.05680: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882911.05686: variable 'ansible_pipelining' from source: unknown 30564 1726882911.05692: variable 'ansible_timeout' from source: unknown 30564 1726882911.05699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882911.05905: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882911.05922: variable 'omit' from source: magic vars 30564 1726882911.05931: starting attempt loop 30564 1726882911.05937: running the handler 30564 1726882911.05954: _low_level_execute_command(): starting 30564 1726882911.05971: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882911.06745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882911.06761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882911.06782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.06801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.06849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882911.06870: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882911.06887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.06909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882911.06923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882911.06937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882911.06952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.06972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.06989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.07003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882911.07014: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882911.07028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.07110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882911.07132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882911.07147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882911.07289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882911.08940: stdout chunk (state=3): >>>/root <<< 30564 1726882911.09049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882911.09133: stderr chunk (state=3): >>><<< 30564 1726882911.09149: stdout chunk (state=3): >>><<< 30564 1726882911.09280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882911.09283: _low_level_execute_command(): starting 30564 1726882911.09286: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341 `" && echo ansible-tmp-1726882911.091826-35323-203615913695341="` echo /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341 `" ) && sleep 0' 30564 1726882911.09881: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882911.09894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.09908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.09929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.09975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882911.09987: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882911.10000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.10016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882911.10026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882911.10042: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882911.10053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.10067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.10085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.10096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882911.10105: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882911.10117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.10202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882911.10222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882911.10236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882911.10363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882911.12258: stdout chunk (state=3): >>>ansible-tmp-1726882911.091826-35323-203615913695341=/root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341 <<< 30564 1726882911.12372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882911.12449: stderr chunk (state=3): >>><<< 30564 1726882911.12470: stdout chunk (state=3): >>><<< 30564 1726882911.12775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882911.091826-35323-203615913695341=/root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882911.12779: variable 'ansible_module_compression' from source: unknown 30564 1726882911.12781: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882911.12783: variable 'ansible_facts' from source: unknown 30564 1726882911.12829: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341/AnsiballZ_package_facts.py 30564 1726882911.13008: Sending initial data 30564 1726882911.13011: Sent initial data (161 bytes) 30564 1726882911.14027: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882911.14041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.14055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.14080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.14125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882911.14137: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882911.14149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.14167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882911.14181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882911.14196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882911.14208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.14220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.14234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.14244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882911.14254: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882911.14267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.14349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882911.14377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882911.14395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882911.14535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882911.16282: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882911.16370: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882911.16472: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpfkhivakt /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341/AnsiballZ_package_facts.py <<< 30564 1726882911.16565: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882911.19308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882911.19506: stderr chunk (state=3): >>><<< 30564 1726882911.19509: stdout chunk (state=3): >>><<< 30564 
1726882911.19511: done transferring module to remote 30564 1726882911.19517: _low_level_execute_command(): starting 30564 1726882911.19519: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341/ /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341/AnsiballZ_package_facts.py && sleep 0' 30564 1726882911.20107: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.20111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.20127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882911.20135: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882911.20145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.20160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882911.20172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882911.20178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882911.20185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.20195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.20207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.20214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882911.20221: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882911.20231: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.20304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882911.20320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882911.20333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882911.20458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882911.22312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882911.22316: stdout chunk (state=3): >>><<< 30564 1726882911.22323: stderr chunk (state=3): >>><<< 30564 1726882911.22343: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882911.22346: _low_level_execute_command(): starting 30564 1726882911.22352: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341/AnsiballZ_package_facts.py && sleep 0' 30564 1726882911.22904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.22908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882911.22942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.22946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.22948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882911.22994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882911.23002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882911.23010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882911.23127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882911.69097: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.14", "release": "11.el9", "epoch": nu<<< 30564 1726882911.69134: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": 
[{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", 
"release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", 
"version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": 
[{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0",
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", 
"release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", 
"release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9",
"epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch",
"source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", 
"version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56",
"release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 30564 1726882911.69292: stdout chunk (state=3): >>>4", "source": 
"rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": 
"1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 30564 1726882911.69313: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": 
[{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 30564 1726882911.69319: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882911.70710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882911.70773: stderr chunk (state=3): >>><<< 30564 1726882911.70776: stdout chunk (state=3): >>><<< 30564 1726882911.70808: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": 
"6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": 
"26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": 
[{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": 
"3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", 
"version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": 
"kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": 
[{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", 
"version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": 
"1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", 
"version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": 
"1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", 
"source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": 
"perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882911.72275: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882911.72292: _low_level_execute_command(): starting 30564 1726882911.72295: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882911.091826-35323-203615913695341/ > /dev/null 2>&1 && sleep 0' 30564 1726882911.72776: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882911.72781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882911.72881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882911.72886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882911.72997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882911.74803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882911.74856: stderr chunk (state=3): >>><<< 30564 1726882911.74859: stdout chunk (state=3): >>><<< 30564 1726882911.74875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882911.74881: handler run complete 30564 1726882911.75390: variable 'ansible_facts' from source: unknown 30564 1726882911.75684: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882911.90100: variable 'ansible_facts' from source: unknown 30564 1726882911.90862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882911.91786: attempt loop complete, returning result 30564 1726882911.91807: _execute() done 30564 1726882911.91810: dumping result to json 30564 1726882911.92027: done dumping result, returning 30564 1726882911.92035: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000002385] 30564 1726882911.92038: sending task result for task 0e448fcc-3ce9-4216-acec-000000002385 30564 1726882911.94701: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002385 30564 1726882911.94705: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882911.94875: no more pending results, returning what we have 30564 1726882911.94878: results queue empty 30564 1726882911.94879: checking for any_errors_fatal 30564 1726882911.94884: done checking for any_errors_fatal 30564 1726882911.94884: checking for max_fail_percentage 30564 1726882911.94886: done checking for max_fail_percentage 30564 1726882911.94887: checking to see if all hosts have failed and the running result is not ok 30564 1726882911.94888: done checking to see if all hosts have failed 30564 1726882911.94888: getting the remaining hosts for this loop 30564 1726882911.94890: done getting the remaining hosts for this loop 30564 1726882911.94893: getting the next task for host managed_node2 30564 1726882911.94901: done getting next task for host managed_node2 30564 1726882911.94904: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882911.94909: 
^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882911.94921: getting variables 30564 1726882911.94923: in VariableManager get_vars() 30564 1726882911.94957: Calling all_inventory to load vars for managed_node2 30564 1726882911.94961: Calling groups_inventory to load vars for managed_node2 30564 1726882911.94965: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882911.94977: Calling all_plugins_play to load vars for managed_node2 30564 1726882911.94980: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882911.94983: Calling groups_plugins_play to load vars for managed_node2 30564 1726882912.14221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882912.17911: done with get_vars() 30564 1726882912.17942: done getting variables 30564 1726882912.18104: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:52 -0400 (0:00:01.144) 0:01:50.762 ****** 30564 1726882912.18142: entering _queue_task() for managed_node2/debug 30564 1726882912.18959: worker is 1 (out of 1 available) 30564 1726882912.18977: exiting _queue_task() for managed_node2/debug 30564 1726882912.18991: done queuing things up, now waiting for results queue to drain 30564 1726882912.18992: waiting for pending results... 
30564 1726882912.19912: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882912.20322: in run() - task 0e448fcc-3ce9-4216-acec-000000002329 30564 1726882912.20345: variable 'ansible_search_path' from source: unknown 30564 1726882912.20356: variable 'ansible_search_path' from source: unknown 30564 1726882912.20513: calling self._execute() 30564 1726882912.20628: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882912.20700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882912.20715: variable 'omit' from source: magic vars 30564 1726882912.21683: variable 'ansible_distribution_major_version' from source: facts 30564 1726882912.21702: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882912.21713: variable 'omit' from source: magic vars 30564 1726882912.21901: variable 'omit' from source: magic vars 30564 1726882912.22121: variable 'network_provider' from source: set_fact 30564 1726882912.22146: variable 'omit' from source: magic vars 30564 1726882912.22196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882912.22306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882912.22339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882912.22362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882912.22455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882912.22495: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882912.22504: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882912.22511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882912.22732: Set connection var ansible_timeout to 10 30564 1726882912.22742: Set connection var ansible_pipelining to False 30564 1726882912.22773: Set connection var ansible_shell_type to sh 30564 1726882912.22784: Set connection var ansible_shell_executable to /bin/sh 30564 1726882912.22884: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882912.22891: Set connection var ansible_connection to ssh 30564 1726882912.22916: variable 'ansible_shell_executable' from source: unknown 30564 1726882912.22923: variable 'ansible_connection' from source: unknown 30564 1726882912.22929: variable 'ansible_module_compression' from source: unknown 30564 1726882912.22935: variable 'ansible_shell_type' from source: unknown 30564 1726882912.22942: variable 'ansible_shell_executable' from source: unknown 30564 1726882912.22949: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882912.22956: variable 'ansible_pipelining' from source: unknown 30564 1726882912.22962: variable 'ansible_timeout' from source: unknown 30564 1726882912.22974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882912.23194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882912.23256: variable 'omit' from source: magic vars 30564 1726882912.23270: starting attempt loop 30564 1726882912.23278: running the handler 30564 1726882912.23332: handler run complete 30564 1726882912.24009: attempt loop complete, returning result 30564 1726882912.24018: _execute() done 30564 1726882912.24025: dumping result to json 30564 1726882912.24033: done dumping result, returning 
30564 1726882912.24044: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000002329] 30564 1726882912.24055: sending task result for task 0e448fcc-3ce9-4216-acec-000000002329 ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882912.24288: no more pending results, returning what we have 30564 1726882912.24292: results queue empty 30564 1726882912.24294: checking for any_errors_fatal 30564 1726882912.24308: done checking for any_errors_fatal 30564 1726882912.24309: checking for max_fail_percentage 30564 1726882912.24311: done checking for max_fail_percentage 30564 1726882912.24312: checking to see if all hosts have failed and the running result is not ok 30564 1726882912.24313: done checking to see if all hosts have failed 30564 1726882912.24314: getting the remaining hosts for this loop 30564 1726882912.24316: done getting the remaining hosts for this loop 30564 1726882912.24320: getting the next task for host managed_node2 30564 1726882912.24330: done getting next task for host managed_node2 30564 1726882912.24336: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882912.24342: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882912.24358: getting variables 30564 1726882912.24360: in VariableManager get_vars() 30564 1726882912.24412: Calling all_inventory to load vars for managed_node2 30564 1726882912.24416: Calling groups_inventory to load vars for managed_node2 30564 1726882912.24418: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882912.24429: Calling all_plugins_play to load vars for managed_node2 30564 1726882912.24433: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882912.24436: Calling groups_plugins_play to load vars for managed_node2 30564 1726882912.25385: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002329 30564 1726882912.25389: WORKER PROCESS EXITING 30564 1726882912.27225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882912.31237: done with get_vars() 30564 1726882912.31273: done getting variables 30564 1726882912.31338: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:52 -0400 (0:00:00.134) 0:01:50.897 ****** 30564 1726882912.31591: entering _queue_task() for managed_node2/fail 30564 1726882912.32141: worker is 1 (out of 1 available) 30564 1726882912.32156: exiting _queue_task() for managed_node2/fail 30564 1726882912.32375: done queuing things up, now waiting for results queue to drain 30564 1726882912.32377: waiting for pending results... 30564 1726882912.33230: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882912.33611: in run() - task 0e448fcc-3ce9-4216-acec-00000000232a 30564 1726882912.33632: variable 'ansible_search_path' from source: unknown 30564 1726882912.33640: variable 'ansible_search_path' from source: unknown 30564 1726882912.33686: calling self._execute() 30564 1726882912.34020: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882912.34035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882912.34051: variable 'omit' from source: magic vars 30564 1726882912.34803: variable 'ansible_distribution_major_version' from source: facts 30564 1726882912.34897: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882912.35162: variable 'network_state' from source: role '' defaults 30564 1726882912.35185: Evaluated conditional (network_state != {}): False 30564 1726882912.35193: when evaluation is False, skipping this task 30564 1726882912.35201: _execute() done 30564 1726882912.35212: dumping result to json 30564 1726882912.35222: done dumping result, returning 30564 1726882912.35337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-00000000232a] 30564 1726882912.35349: sending task result for task 0e448fcc-3ce9-4216-acec-00000000232a skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882912.35522: no more pending results, returning what we have 30564 1726882912.35526: results queue empty 30564 1726882912.35527: checking for any_errors_fatal 30564 1726882912.35535: done checking for any_errors_fatal 30564 1726882912.35536: checking for max_fail_percentage 30564 1726882912.35538: done checking for max_fail_percentage 30564 1726882912.35539: checking to see if all hosts have failed and the running result is not ok 30564 1726882912.35540: done checking to see if all hosts have failed 30564 1726882912.35541: getting the remaining hosts for this loop 30564 1726882912.35543: done getting the remaining hosts for this loop 30564 1726882912.35547: getting the next task for host managed_node2 30564 1726882912.35557: done getting next task for host managed_node2 30564 1726882912.35561: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882912.35572: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882912.35600: getting variables 30564 1726882912.35602: in VariableManager get_vars() 30564 1726882912.35649: Calling all_inventory to load vars for managed_node2 30564 1726882912.35652: Calling groups_inventory to load vars for managed_node2 30564 1726882912.35654: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882912.35671: Calling all_plugins_play to load vars for managed_node2 30564 1726882912.35674: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882912.35677: Calling groups_plugins_play to load vars for managed_node2 30564 1726882912.36787: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000232a 30564 1726882912.36790: WORKER PROCESS EXITING 30564 1726882912.38478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882912.41922: done with get_vars() 30564 1726882912.41952: done getting variables 30564 1726882912.42019: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:41:52 -0400 (0:00:00.104) 0:01:51.001 ****** 30564 1726882912.42056: entering _queue_task() for managed_node2/fail 30564 1726882912.43010: worker is 1 (out of 1 available) 30564 1726882912.43024: exiting _queue_task() for managed_node2/fail 30564 1726882912.43036: done queuing things up, now waiting for results queue to drain 30564 1726882912.43037: waiting for pending results... 30564 1726882912.43999: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882912.44409: in run() - task 0e448fcc-3ce9-4216-acec-00000000232b 30564 1726882912.44430: variable 'ansible_search_path' from source: unknown 30564 1726882912.44474: variable 'ansible_search_path' from source: unknown 30564 1726882912.44518: calling self._execute() 30564 1726882912.44803: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882912.44817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882912.44834: variable 'omit' from source: magic vars 30564 1726882912.45620: variable 'ansible_distribution_major_version' from source: facts 30564 1726882912.45790: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882912.46042: variable 'network_state' from source: role '' defaults 30564 1726882912.46056: Evaluated conditional (network_state != {}): False 30564 1726882912.46066: when evaluation is False, skipping this task 30564 1726882912.46077: _execute() done 30564 1726882912.46085: dumping result to json 30564 1726882912.46103: done dumping result, returning 30564 1726882912.46115: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the 
system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-00000000232b] 30564 1726882912.46217: sending task result for task 0e448fcc-3ce9-4216-acec-00000000232b skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882912.46383: no more pending results, returning what we have 30564 1726882912.46388: results queue empty 30564 1726882912.46389: checking for any_errors_fatal 30564 1726882912.46399: done checking for any_errors_fatal 30564 1726882912.46400: checking for max_fail_percentage 30564 1726882912.46402: done checking for max_fail_percentage 30564 1726882912.46403: checking to see if all hosts have failed and the running result is not ok 30564 1726882912.46404: done checking to see if all hosts have failed 30564 1726882912.46404: getting the remaining hosts for this loop 30564 1726882912.46406: done getting the remaining hosts for this loop 30564 1726882912.46410: getting the next task for host managed_node2 30564 1726882912.46418: done getting next task for host managed_node2 30564 1726882912.46422: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882912.46429: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882912.46454: getting variables 30564 1726882912.46456: in VariableManager get_vars() 30564 1726882912.46505: Calling all_inventory to load vars for managed_node2 30564 1726882912.46507: Calling groups_inventory to load vars for managed_node2 30564 1726882912.46509: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882912.46521: Calling all_plugins_play to load vars for managed_node2 30564 1726882912.46524: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882912.46526: Calling groups_plugins_play to load vars for managed_node2 30564 1726882912.48092: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000232b 30564 1726882912.48096: WORKER PROCESS EXITING 30564 1726882912.50076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882912.53457: done with get_vars() 30564 1726882912.53493: done getting variables 30564 1726882912.53553: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 
or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:41:52 -0400 (0:00:00.115) 0:01:51.117 ****** 30564 1726882912.53597: entering _queue_task() for managed_node2/fail 30564 1726882912.54566: worker is 1 (out of 1 available) 30564 1726882912.54585: exiting _queue_task() for managed_node2/fail 30564 1726882912.54597: done queuing things up, now waiting for results queue to drain 30564 1726882912.54598: waiting for pending results... 30564 1726882912.55353: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882912.55759: in run() - task 0e448fcc-3ce9-4216-acec-00000000232c 30564 1726882912.55823: variable 'ansible_search_path' from source: unknown 30564 1726882912.55832: variable 'ansible_search_path' from source: unknown 30564 1726882912.55881: calling self._execute() 30564 1726882912.56248: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882912.56262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882912.56283: variable 'omit' from source: magic vars 30564 1726882912.57030: variable 'ansible_distribution_major_version' from source: facts 30564 1726882912.57130: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882912.57471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882912.62637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882912.62727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882912.62856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 
1726882912.62901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882912.62931: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882912.63147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882912.63190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882912.63305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882912.63350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882912.63406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882912.63575: variable 'ansible_distribution_major_version' from source: facts 30564 1726882912.63695: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882912.63705: when evaluation is False, skipping this task 30564 1726882912.63718: _execute() done 30564 1726882912.63725: dumping result to json 30564 1726882912.63828: done dumping result, returning 30564 1726882912.63842: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0e448fcc-3ce9-4216-acec-00000000232c] 30564 1726882912.63852: sending task result for task 0e448fcc-3ce9-4216-acec-00000000232c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882912.64024: no more pending results, returning what we have 30564 1726882912.64028: results queue empty 30564 1726882912.64030: checking for any_errors_fatal 30564 1726882912.64036: done checking for any_errors_fatal 30564 1726882912.64037: checking for max_fail_percentage 30564 1726882912.64040: done checking for max_fail_percentage 30564 1726882912.64041: checking to see if all hosts have failed and the running result is not ok 30564 1726882912.64042: done checking to see if all hosts have failed 30564 1726882912.64043: getting the remaining hosts for this loop 30564 1726882912.64045: done getting the remaining hosts for this loop 30564 1726882912.64049: getting the next task for host managed_node2 30564 1726882912.64059: done getting next task for host managed_node2 30564 1726882912.64065: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882912.64073: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882912.64101: getting variables 30564 1726882912.64103: in VariableManager get_vars() 30564 1726882912.64149: Calling all_inventory to load vars for managed_node2 30564 1726882912.64152: Calling groups_inventory to load vars for managed_node2 30564 1726882912.64154: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882912.64171: Calling all_plugins_play to load vars for managed_node2 30564 1726882912.64174: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882912.64178: Calling groups_plugins_play to load vars for managed_node2 30564 1726882912.65185: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000232c 30564 1726882912.65189: WORKER PROCESS EXITING 30564 1726882912.67981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882912.73152: done with get_vars() 30564 1726882912.73193: done getting variables 30564 1726882912.73255: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due 
to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:41:52 -0400 (0:00:00.196) 0:01:51.314 ****** 30564 1726882912.73295: entering _queue_task() for managed_node2/dnf 30564 1726882912.74654: worker is 1 (out of 1 available) 30564 1726882912.74671: exiting _queue_task() for managed_node2/dnf 30564 1726882912.74686: done queuing things up, now waiting for results queue to drain 30564 1726882912.74687: waiting for pending results... 30564 1726882912.75557: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882912.75808: in run() - task 0e448fcc-3ce9-4216-acec-00000000232d 30564 1726882912.75935: variable 'ansible_search_path' from source: unknown 30564 1726882912.75940: variable 'ansible_search_path' from source: unknown 30564 1726882912.75978: calling self._execute() 30564 1726882912.76301: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882912.76305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882912.76318: variable 'omit' from source: magic vars 30564 1726882912.77147: variable 'ansible_distribution_major_version' from source: facts 30564 1726882912.77162: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882912.77590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882912.82896: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882912.82985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882912.83022: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882912.83052: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882912.83218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882912.83409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882912.83435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882912.83460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882912.83589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882912.83636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882912.83866: variable 'ansible_distribution' from source: facts 30564 1726882912.83872: variable 'ansible_distribution_major_version' from source: facts 30564 1726882912.83885: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882912.84112: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882912.84355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882912.84522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882912.84545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882912.84702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882912.84717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882912.84756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882912.84779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882912.84941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882912.84982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882912.84997: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882912.85148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882912.85173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882912.85195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882912.85347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882912.85362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882912.85625: variable 'network_connections' from source: include params
30564 1726882912.85636: variable 'interface' from source: play vars
30564 1726882912.85917: variable 'interface' from source: play vars
30564 1726882912.85991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882912.86282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882912.86429: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882912.86459: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882912.86488: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882912.86531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882912.86665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882912.86691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882912.86716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882912.86928: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882912.87319: variable 'network_connections' from source: include params
30564 1726882912.87324: variable 'interface' from source: play vars
30564 1726882912.87385: variable 'interface' from source: play vars
30564 1726882912.87414: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882912.87417: when evaluation is False, skipping this task
30564 1726882912.87419: _execute() done
30564 1726882912.87422: dumping result to json
30564 1726882912.87426: done dumping result, returning
30564 1726882912.87434: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000232d]
30564 1726882912.87440: sending task result for task 0e448fcc-3ce9-4216-acec-00000000232d
30564 1726882912.87544: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000232d
30564 1726882912.87547: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882912.87599: no more pending results, returning what we have
30564 1726882912.87602: results queue empty
30564 1726882912.87603: checking for any_errors_fatal
30564 1726882912.87611: done checking for any_errors_fatal
30564 1726882912.87612: checking for max_fail_percentage
30564 1726882912.87614: done checking for max_fail_percentage
30564 1726882912.87615: checking to see if all hosts have failed and the running result is not ok
30564 1726882912.87615: done checking to see if all hosts have failed
30564 1726882912.87616: getting the remaining hosts for this loop
30564 1726882912.87617: done getting the remaining hosts for this loop
30564 1726882912.87621: getting the next task for host managed_node2
30564 1726882912.87629: done getting next task for host managed_node2
30564 1726882912.87634: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882912.87639: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882912.87669: getting variables
30564 1726882912.87671: in VariableManager get_vars()
30564 1726882912.87716: Calling all_inventory to load vars for managed_node2
30564 1726882912.87719: Calling groups_inventory to load vars for managed_node2
30564 1726882912.87721: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882912.87733: Calling all_plugins_play to load vars for managed_node2
30564 1726882912.87736: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882912.87739: Calling groups_plugins_play to load vars for managed_node2
30564 1726882912.91116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882912.94324: done with get_vars()
30564 1726882912.94353: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30564 1726882912.94431: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024  21:41:52 -0400 (0:00:00.211)       0:01:51.525 ******
30564 1726882912.94767: entering _queue_task() for managed_node2/yum
30564 1726882912.95328: worker is 1 (out of 1 available)
30564 1726882912.95341: exiting _queue_task() for managed_node2/yum
30564 1726882912.95356: done queuing things up, now waiting for results queue to drain
30564 1726882912.95357: waiting for pending results...
30564 1726882912.96358: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882912.96618: in run() - task 0e448fcc-3ce9-4216-acec-00000000232e
30564 1726882912.96674: variable 'ansible_search_path' from source: unknown
30564 1726882912.96773: variable 'ansible_search_path' from source: unknown
30564 1726882912.96816: calling self._execute()
30564 1726882912.97042: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882912.97054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882912.97076: variable 'omit' from source: magic vars
30564 1726882912.97938: variable 'ansible_distribution_major_version' from source: facts
30564 1726882912.97962: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882912.98357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882913.04503: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882913.04596: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882913.04754: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882913.04797: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882913.04883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882913.05101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.05135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.05208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.05306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.05385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.05612: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.05638: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30564 1726882913.05696: when evaluation is False, skipping this task
30564 1726882913.05704: _execute() done
30564 1726882913.05711: dumping result to json
30564 1726882913.05718: done dumping result, returning
30564 1726882913.05736: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000232e]
30564 1726882913.05746: sending task result for task 0e448fcc-3ce9-4216-acec-00000000232e
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30564 1726882913.05925: no more pending results, returning what we have
30564 1726882913.05929: results queue empty
30564 1726882913.05930: checking for any_errors_fatal
30564 1726882913.05941: done checking for any_errors_fatal
30564 1726882913.05943: checking for max_fail_percentage
30564 1726882913.05945: done checking for max_fail_percentage
30564 1726882913.05946: checking to see if all hosts have failed and the running result is not ok
30564 1726882913.05947: done checking to see if all hosts have failed
30564 1726882913.05948: getting the remaining hosts for this loop
30564 1726882913.05950: done getting the remaining hosts for this loop
30564 1726882913.05954: getting the next task for host managed_node2
30564 1726882913.05966: done getting next task for host managed_node2
30564 1726882913.05972: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882913.05978: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882913.06005: getting variables
30564 1726882913.06007: in VariableManager get_vars()
30564 1726882913.06054: Calling all_inventory to load vars for managed_node2
30564 1726882913.06057: Calling groups_inventory to load vars for managed_node2
30564 1726882913.06060: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882913.06075: Calling all_plugins_play to load vars for managed_node2
30564 1726882913.06079: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882913.06082: Calling groups_plugins_play to load vars for managed_node2
30564 1726882913.07474: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000232e
30564 1726882913.07480: WORKER PROCESS EXITING
30564 1726882913.09599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882913.12199: done with get_vars()
30564 1726882913.12238: done getting variables
30564 1726882913.12304: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024  21:41:53 -0400 (0:00:00.178)       0:01:51.704 ******
30564 1726882913.12346: entering _queue_task() for managed_node2/fail
30564 1726882913.12718: worker is 1 (out of 1 available)
30564 1726882913.12733: exiting _queue_task() for managed_node2/fail
30564 1726882913.12745: done queuing things up, now waiting for results queue to drain
30564 1726882913.12747: waiting for pending results...
30564 1726882913.13058: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882913.13204: in run() - task 0e448fcc-3ce9-4216-acec-00000000232f
30564 1726882913.13224: variable 'ansible_search_path' from source: unknown
30564 1726882913.13229: variable 'ansible_search_path' from source: unknown
30564 1726882913.13265: calling self._execute()
30564 1726882913.13378: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882913.13382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882913.13395: variable 'omit' from source: magic vars
30564 1726882913.13810: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.13823: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882913.13954: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882913.14160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882913.16523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882913.16601: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882913.16636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882913.16672: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882913.16699: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882913.16777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.16806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.16829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.16872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.16883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.16929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.16950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.16976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.17020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.17034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.17074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.17097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.17125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.17164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.17181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.17361: variable 'network_connections' from source: include params
30564 1726882913.17376: variable 'interface' from source: play vars
30564 1726882913.17447: variable 'interface' from source: play vars
30564 1726882913.17520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882913.17693: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882913.17904: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882913.17933: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882913.17957: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882913.18005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882913.18024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882913.18046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.18073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882913.18126: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882913.18370: variable 'network_connections' from source: include params
30564 1726882913.18373: variable 'interface' from source: play vars
30564 1726882913.18435: variable 'interface' from source: play vars
30564 1726882913.18457: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882913.18461: when evaluation is False, skipping this task
30564 1726882913.18465: _execute() done
30564 1726882913.18469: dumping result to json
30564 1726882913.18472: done dumping result, returning
30564 1726882913.18476: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000232f]
30564 1726882913.18483: sending task result for task 0e448fcc-3ce9-4216-acec-00000000232f
30564 1726882913.18591: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000232f
30564 1726882913.18594: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882913.18644: no more pending results, returning what we have
30564 1726882913.18648: results queue empty
30564 1726882913.18649: checking for any_errors_fatal
30564 1726882913.18656: done checking for any_errors_fatal
30564 1726882913.18657: checking for max_fail_percentage
30564 1726882913.18658: done checking for max_fail_percentage
30564 1726882913.18659: checking to see if all hosts have failed and the running result is not ok
30564 1726882913.18660: done checking to see if all hosts have failed
30564 1726882913.18660: getting the remaining hosts for this loop
30564 1726882913.18662: done getting the remaining hosts for this loop
30564 1726882913.18670: getting the next task for host managed_node2
30564 1726882913.18680: done getting next task for host managed_node2
30564 1726882913.18684: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
30564 1726882913.18689: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882913.18716: getting variables
30564 1726882913.18718: in VariableManager get_vars()
30564 1726882913.18760: Calling all_inventory to load vars for managed_node2
30564 1726882913.18762: Calling groups_inventory to load vars for managed_node2
30564 1726882913.18772: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882913.18782: Calling all_plugins_play to load vars for managed_node2
30564 1726882913.18785: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882913.18787: Calling groups_plugins_play to load vars for managed_node2
30564 1726882913.21746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882913.25700: done with get_vars()
30564 1726882913.25851: done getting variables
30564 1726882913.25917: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024  21:41:53 -0400 (0:00:00.137)       0:01:51.842 ******
30564 1726882913.26077: entering _queue_task() for managed_node2/package
30564 1726882913.26813: worker is 1 (out of 1 available)
30564 1726882913.26830: exiting _queue_task() for managed_node2/package
30564 1726882913.26843: done queuing things up, now waiting for results queue to drain
30564 1726882913.26844: waiting for pending results...
30564 1726882913.27750: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages
30564 1726882913.28003: in run() - task 0e448fcc-3ce9-4216-acec-000000002330
30564 1726882913.28106: variable 'ansible_search_path' from source: unknown
30564 1726882913.28115: variable 'ansible_search_path' from source: unknown
30564 1726882913.28155: calling self._execute()
30564 1726882913.28378: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882913.28391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882913.28409: variable 'omit' from source: magic vars
30564 1726882913.29249: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.29271: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882913.29599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882913.30206: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882913.30315: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882913.30404: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882913.30467: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882913.30820: variable 'network_packages' from source: role '' defaults
30564 1726882913.31043: variable '__network_provider_setup' from source: role '' defaults
30564 1726882913.31062: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882913.31242: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882913.31258: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882913.31328: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882913.31741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882913.36122: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882913.36198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882913.36241: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882913.36282: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882913.36314: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882913.36411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.36445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.36477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.36528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.36547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.36598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.36627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.36656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.36707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.36726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.36967: variable '__network_packages_default_gobject_packages' from source: role '' defaults
30564 1726882913.37084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.37113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.37148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.37193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.37210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.37305: variable 'ansible_python' from source: facts
30564 1726882913.37326: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
30564 1726882913.37414: variable '__network_wpa_supplicant_required' from source: role '' defaults
30564 1726882913.37501: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30564 1726882913.37629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.37657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.37693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.37736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.37753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.37808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.37845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.37875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.37923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.37942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.38091: variable 'network_connections' from source: include params
30564 1726882913.38102: variable 'interface' from source: play vars
30564 1726882913.38206: variable 'interface' from source: play vars
30564 1726882913.38282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882913.38314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882913.38352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.38389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882913.38444: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882913.38739: variable 'network_connections' from source: include params
30564 1726882913.38748: variable 'interface' from source: play vars
30564 1726882913.38852: variable 'interface' from source: play vars
30564 1726882913.38893: variable '__network_packages_default_wireless' from source: role '' defaults
30564 1726882913.38976: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882913.39286: variable 'network_connections' from source: include params
30564 1726882913.39295: variable 'interface' from source: play vars
30564 1726882913.39362: variable 'interface' from source: play vars
30564 1726882913.39390: variable '__network_packages_default_team' from source: role '' defaults
30564 1726882913.39474: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882913.39789: variable 'network_connections' from source: include params
30564 1726882913.39798: variable 'interface' from source: play vars
30564 1726882913.39869: variable 'interface' from source: play vars
30564 1726882913.39925: variable '__network_service_name_default_initscripts' from source: role '' defaults
30564 1726882913.39994: variable '__network_service_name_default_initscripts' from source: role '' defaults
30564 1726882913.40006: variable '__network_packages_default_initscripts' from source: role '' defaults
30564 1726882913.40074: variable '__network_packages_default_initscripts' from source: role '' defaults
30564 1726882913.40301: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
30564 1726882913.40786: variable 'network_connections' from source: include params
30564 1726882913.40796: variable 'interface' from source: play vars
30564 1726882913.40860: variable 'interface' from source: play vars
30564 1726882913.40874: variable 'ansible_distribution' from source: facts
30564 1726882913.40883: variable '__network_rh_distros' from source: role '' defaults
30564 1726882913.40893: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.40911: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
30564 1726882913.41089: variable 'ansible_distribution' from source: facts
30564 1726882913.41099: variable '__network_rh_distros' from source: role '' defaults
30564 1726882913.41109: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.41126: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
30564 1726882913.41300: variable 'ansible_distribution' from source: facts
30564 1726882913.41310: variable '__network_rh_distros' from source: role '' defaults
30564 1726882913.41320: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.41359: variable 'network_provider' from source: set_fact
30564
1726882913.41385: variable 'ansible_facts' from source: unknown 30564 1726882913.42210: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882913.42219: when evaluation is False, skipping this task 30564 1726882913.42226: _execute() done 30564 1726882913.42232: dumping result to json 30564 1726882913.42245: done dumping result, returning 30564 1726882913.42256: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-000000002330] 30564 1726882913.42268: sending task result for task 0e448fcc-3ce9-4216-acec-000000002330 30564 1726882913.42386: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002330 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882913.42438: no more pending results, returning what we have 30564 1726882913.42442: results queue empty 30564 1726882913.42443: checking for any_errors_fatal 30564 1726882913.42449: done checking for any_errors_fatal 30564 1726882913.42450: checking for max_fail_percentage 30564 1726882913.42453: done checking for max_fail_percentage 30564 1726882913.42454: checking to see if all hosts have failed and the running result is not ok 30564 1726882913.42455: done checking to see if all hosts have failed 30564 1726882913.42456: getting the remaining hosts for this loop 30564 1726882913.42458: done getting the remaining hosts for this loop 30564 1726882913.42461: getting the next task for host managed_node2 30564 1726882913.42474: done getting next task for host managed_node2 30564 1726882913.42478: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882913.42483: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882913.42512: getting variables 30564 1726882913.42514: in VariableManager get_vars() 30564 1726882913.42557: Calling all_inventory to load vars for managed_node2 30564 1726882913.42566: Calling groups_inventory to load vars for managed_node2 30564 1726882913.42571: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882913.42583: Calling all_plugins_play to load vars for managed_node2 30564 1726882913.42586: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882913.42589: Calling groups_plugins_play to load vars for managed_node2 30564 1726882913.43111: WORKER PROCESS EXITING 30564 1726882913.44752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882913.47324: done with get_vars() 30564 1726882913.47348: done getting variables 30564 1726882913.47418: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:41:53 -0400 (0:00:00.213) 0:01:52.055 ****** 30564 1726882913.47455: entering _queue_task() for managed_node2/package 30564 1726882913.47807: worker is 1 (out of 1 available) 30564 1726882913.47819: exiting _queue_task() for managed_node2/package 30564 1726882913.47832: done queuing things up, now waiting for results queue to drain 30564 1726882913.47834: waiting for pending results... 
30564 1726882913.48151: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
30564 1726882913.48308: in run() - task 0e448fcc-3ce9-4216-acec-000000002331
30564 1726882913.48323: variable 'ansible_search_path' from source: unknown
30564 1726882913.48326: variable 'ansible_search_path' from source: unknown
30564 1726882913.48357: calling self._execute()
30564 1726882913.48469: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882913.48479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882913.48490: variable 'omit' from source: magic vars
30564 1726882913.48906: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.48920: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882913.49056: variable 'network_state' from source: role '' defaults
30564 1726882913.49068: Evaluated conditional (network_state != {}): False
30564 1726882913.49074: when evaluation is False, skipping this task
30564 1726882913.49078: _execute() done
30564 1726882913.49080: dumping result to json
30564 1726882913.49085: done dumping result, returning
30564 1726882913.49096: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000002331]
30564 1726882913.49099: sending task result for task 0e448fcc-3ce9-4216-acec-000000002331
30564 1726882913.49198: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002331
30564 1726882913.49202: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882913.49250: no more pending results, returning what we have
30564 1726882913.49255: results queue empty
30564 1726882913.49256: checking for any_errors_fatal
30564 1726882913.49265: done checking for any_errors_fatal
30564 1726882913.49266: checking for max_fail_percentage
30564 1726882913.49270: done checking for max_fail_percentage
30564 1726882913.49271: checking to see if all hosts have failed and the running result is not ok
30564 1726882913.49272: done checking to see if all hosts have failed
30564 1726882913.49274: getting the remaining hosts for this loop
30564 1726882913.49276: done getting the remaining hosts for this loop
30564 1726882913.49280: getting the next task for host managed_node2
30564 1726882913.49289: done getting next task for host managed_node2
30564 1726882913.49293: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30564 1726882913.49301: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882913.49328: getting variables
30564 1726882913.49330: in VariableManager get_vars()
30564 1726882913.49385: Calling all_inventory to load vars for managed_node2
30564 1726882913.49388: Calling groups_inventory to load vars for managed_node2
30564 1726882913.49391: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882913.49403: Calling all_plugins_play to load vars for managed_node2
30564 1726882913.49406: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882913.49409: Calling groups_plugins_play to load vars for managed_node2
30564 1726882913.51305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882913.53121: done with get_vars()
30564 1726882913.53147: done getting variables
30564 1726882913.53206: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:41:53 -0400 (0:00:00.057) 0:01:52.113 ******
30564 1726882913.53238: entering _queue_task() for managed_node2/package
30564 1726882913.53540: worker is 1 (out of 1 available)
30564 1726882913.53550: exiting _queue_task() for managed_node2/package
30564 1726882913.53561: done queuing things up, now waiting for results queue to drain
30564 1726882913.53562: waiting for pending results...
30564 1726882913.53869: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30564 1726882913.54005: in run() - task 0e448fcc-3ce9-4216-acec-000000002332
30564 1726882913.54134: variable 'ansible_search_path' from source: unknown
30564 1726882913.54138: variable 'ansible_search_path' from source: unknown
30564 1726882913.54175: calling self._execute()
30564 1726882913.54397: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882913.54401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882913.54412: variable 'omit' from source: magic vars
30564 1726882913.55274: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.55289: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882913.55479: variable 'network_state' from source: role '' defaults
30564 1726882913.55489: Evaluated conditional (network_state != {}): False
30564 1726882913.55492: when evaluation is False, skipping this task
30564 1726882913.55495: _execute() done
30564 1726882913.55498: dumping result to json
30564 1726882913.55500: done dumping result, returning
30564 1726882913.55511: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-000000002332]
30564 1726882913.55517: sending task result for task 0e448fcc-3ce9-4216-acec-000000002332
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882913.55716: no more pending results, returning what we have
30564 1726882913.55720: results queue empty
30564 1726882913.55722: checking for any_errors_fatal
30564 1726882913.55728: done checking for any_errors_fatal
30564 1726882913.55729: checking for max_fail_percentage
30564 1726882913.55731: done checking for max_fail_percentage
30564 1726882913.55733: checking to see if all hosts have failed and the running result is not ok
30564 1726882913.55733: done checking to see if all hosts have failed
30564 1726882913.55734: getting the remaining hosts for this loop
30564 1726882913.55736: done getting the remaining hosts for this loop
30564 1726882913.55739: getting the next task for host managed_node2
30564 1726882913.55748: done getting next task for host managed_node2
30564 1726882913.55753: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30564 1726882913.55761: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882913.55796: getting variables
30564 1726882913.55798: in VariableManager get_vars()
30564 1726882913.55843: Calling all_inventory to load vars for managed_node2
30564 1726882913.55845: Calling groups_inventory to load vars for managed_node2
30564 1726882913.55848: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882913.55862: Calling all_plugins_play to load vars for managed_node2
30564 1726882913.55867: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882913.55873: Calling groups_plugins_play to load vars for managed_node2
30564 1726882913.56543: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002332
30564 1726882913.56547: WORKER PROCESS EXITING
30564 1726882913.58831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882913.62682: done with get_vars()
30564 1726882913.62717: done getting variables
30564 1726882913.62895: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:41:53 -0400 (0:00:00.096) 0:01:52.210 ******
30564 1726882913.62933: entering _queue_task() for managed_node2/service
30564 1726882913.63755: worker is 1 (out of 1 available)
30564 1726882913.63769: exiting _queue_task() for managed_node2/service
30564 1726882913.63781: done queuing things up, now waiting for results queue to drain
30564 1726882913.63782: waiting for pending results...
30564 1726882913.64172: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30564 1726882913.64312: in run() - task 0e448fcc-3ce9-4216-acec-000000002333
30564 1726882913.64325: variable 'ansible_search_path' from source: unknown
30564 1726882913.64329: variable 'ansible_search_path' from source: unknown
30564 1726882913.64402: calling self._execute()
30564 1726882913.64510: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882913.64516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882913.64526: variable 'omit' from source: magic vars
30564 1726882913.64932: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.64945: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882913.65082: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882913.65287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882913.67989: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882913.68066: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882913.68103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882913.68138: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882913.68169: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882913.68248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.68287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.68312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.68356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.68376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.68421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.68445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.68476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.68532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.68546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.68594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.68618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.68643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.68692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.68707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.68889: variable 'network_connections' from source: include params
30564 1726882913.68901: variable 'interface' from source: play vars
30564 1726882913.68969: variable 'interface' from source: play vars
30564 1726882913.69045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882913.69220: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882913.69274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882913.69303: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882913.69334: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882913.69383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882913.69404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882913.69428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.69458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882913.69511: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882913.69795: variable 'network_connections' from source: include params
30564 1726882913.69798: variable 'interface' from source: play vars
30564 1726882913.69888: variable 'interface' from source: play vars
30564 1726882913.69911: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882913.69915: when evaluation is False, skipping this task
30564 1726882913.69917: _execute() done
30564 1726882913.69920: dumping result to json
30564 1726882913.69922: done dumping result, returning
30564 1726882913.69929: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000002333]
30564 1726882913.69936: sending task result for task 0e448fcc-3ce9-4216-acec-000000002333
30564 1726882913.70034: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002333
30564 1726882913.70043: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882913.70094: no more pending results, returning what we have
30564 1726882913.70099: results queue empty
30564 1726882913.70100: checking for any_errors_fatal
30564 1726882913.70106: done checking for any_errors_fatal
30564 1726882913.70106: checking for max_fail_percentage
30564 1726882913.70108: done checking for max_fail_percentage
30564 1726882913.70109: checking to see if all hosts have failed and the running result is not ok
30564 1726882913.70110: done checking to see if all hosts have failed
30564 1726882913.70110: getting the remaining hosts for this loop
30564 1726882913.70112: done getting the remaining hosts for this loop
30564 1726882913.70116: getting the next task for host managed_node2
30564 1726882913.70125: done getting next task for host managed_node2
30564 1726882913.70129: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30564 1726882913.70134: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882913.70161: getting variables
30564 1726882913.70165: in VariableManager get_vars()
30564 1726882913.70210: Calling all_inventory to load vars for managed_node2
30564 1726882913.70213: Calling groups_inventory to load vars for managed_node2
30564 1726882913.70215: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882913.70224: Calling all_plugins_play to load vars for managed_node2
30564 1726882913.70227: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882913.70229: Calling groups_plugins_play to load vars for managed_node2
30564 1726882913.71851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882913.75678: done with get_vars()
30564 1726882913.75719: done getting variables
30564 1726882913.75782: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:41:53 -0400 (0:00:00.130) 0:01:52.340 ******
30564 1726882913.75937: entering _queue_task() for managed_node2/service
30564 1726882913.76725: worker is 1 (out of 1 available)
30564 1726882913.76737: exiting _queue_task() for managed_node2/service
30564 1726882913.76751: done queuing things up, now waiting for results queue to drain
30564 1726882913.76752: waiting for pending results...
30564 1726882913.77564: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30564 1726882913.77806: in run() - task 0e448fcc-3ce9-4216-acec-000000002334
30564 1726882913.77935: variable 'ansible_search_path' from source: unknown
30564 1726882913.77938: variable 'ansible_search_path' from source: unknown
30564 1726882913.77974: calling self._execute()
30564 1726882913.78194: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882913.78198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882913.78210: variable 'omit' from source: magic vars
30564 1726882913.79237: variable 'ansible_distribution_major_version' from source: facts
30564 1726882913.79252: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882913.79877: variable 'network_provider' from source: set_fact
30564 1726882913.79883: variable 'network_state' from source: role '' defaults
30564 1726882913.79893: Evaluated conditional (network_provider == "nm" or network_state != {}): True
30564 1726882913.79899: variable 'omit' from source: magic vars
30564 1726882913.80199: variable 'omit' from source: magic vars
30564 1726882913.80226: variable 'network_service_name' from source: role '' defaults
30564 1726882913.80517: variable 'network_service_name' from source: role '' defaults
30564 1726882913.80653: variable '__network_provider_setup' from source: role '' defaults
30564 1726882913.80659: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882913.80956: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882913.80967: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882913.81027: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882913.81853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882913.88881: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882913.88958: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882913.88996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882913.89145: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882913.89174: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882913.89364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882913.89394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882913.89418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882913.89575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882913.89589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882913.89632: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882913.89768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882913.89797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882913.89836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882913.89850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882913.90368: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882913.90757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882913.90787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882913.90812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882913.90866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882913.90884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882913.90981: variable 'ansible_python' from source: facts 30564 1726882913.90999: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882913.91091: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882913.91169: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882913.91304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882913.91327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882913.91352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882913.91400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882913.91415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882913.91460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882913.91489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882913.91518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882913.91556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882913.91575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882913.91721: variable 'network_connections' from source: include params 30564 1726882913.91727: variable 'interface' from source: play vars 30564 1726882913.91805: variable 'interface' from source: play vars 30564 1726882913.91916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882913.92125: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882913.92182: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882913.92225: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882913.92272: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882913.92335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882913.92368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882913.92407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882913.92440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882913.92494: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882913.92842: variable 'network_connections' from source: include params 30564 1726882913.92846: variable 'interface' from source: play vars 30564 1726882913.92966: variable 'interface' from source: play vars 30564 1726882913.92970: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882913.93008: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882913.93490: variable 'network_connections' from source: include params 30564 1726882913.93493: variable 'interface' from source: play vars 30564 1726882913.93681: variable 'interface' from source: play vars 30564 1726882913.93702: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882913.93893: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882913.94498: variable 'network_connections' from source: include params 30564 1726882913.94502: variable 'interface' from source: play vars 30564 1726882913.94585: variable 'interface' from source: play vars 30564 1726882913.94644: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882913.94713: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882913.94719: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882913.94789: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882913.95026: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882913.95547: variable 'network_connections' from source: include params 30564 1726882913.95550: variable 'interface' from source: play vars 30564 1726882913.95613: variable 'interface' from source: play vars 30564 1726882913.95621: variable 'ansible_distribution' from source: facts 30564 1726882913.95629: variable '__network_rh_distros' from source: role '' defaults 30564 1726882913.95636: variable 'ansible_distribution_major_version' from source: facts 30564 1726882913.95650: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882913.95831: variable 'ansible_distribution' from source: facts 30564 1726882913.95834: variable '__network_rh_distros' from source: role '' defaults 30564 1726882913.95845: variable 'ansible_distribution_major_version' from source: facts 30564 1726882913.95857: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882913.96035: variable 'ansible_distribution' from source: facts 30564 1726882913.96039: variable '__network_rh_distros' from source: role '' defaults 30564 1726882913.96044: variable 'ansible_distribution_major_version' from source: facts 30564 1726882913.96087: variable 'network_provider' from source: set_fact 30564 1726882913.96110: variable 'omit' from source: magic vars 30564 1726882913.96137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882913.96166: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882913.96192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882913.96209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882913.96219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882913.96249: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882913.96252: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882913.96254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882913.96366: Set connection var ansible_timeout to 10 30564 1726882913.96373: Set connection var ansible_pipelining to False 30564 1726882913.96376: Set connection var ansible_shell_type to sh 30564 1726882913.96387: Set connection var ansible_shell_executable to /bin/sh 30564 1726882913.96396: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882913.96398: Set connection var ansible_connection to ssh 30564 1726882913.96427: variable 'ansible_shell_executable' from source: unknown 30564 1726882913.96430: variable 'ansible_connection' from source: unknown 30564 1726882913.96432: variable 'ansible_module_compression' from source: unknown 30564 1726882913.96435: variable 'ansible_shell_type' from source: unknown 30564 1726882913.96437: variable 'ansible_shell_executable' from source: unknown 30564 1726882913.96439: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882913.96443: variable 'ansible_pipelining' from source: unknown 30564 1726882913.96445: variable 'ansible_timeout' from source: unknown 30564 1726882913.96449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882913.96562: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882913.96574: variable 'omit' from source: magic vars 30564 1726882913.96580: starting attempt loop 30564 1726882913.96583: running the handler 30564 1726882913.96665: variable 'ansible_facts' from source: unknown 30564 1726882913.97502: _low_level_execute_command(): starting 30564 1726882913.97508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882913.98253: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882913.98266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882913.98282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882913.98295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882913.98334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882913.98340: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882913.98353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882913.98367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882913.98379: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882913.98385: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882913.98393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882913.98403: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882913.98414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882913.98421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882913.98428: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882913.98438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882913.98516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882913.98533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882913.98540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882913.98685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.00350: stdout chunk (state=3): >>>/root <<< 30564 1726882914.00511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.00521: stderr chunk (state=3): >>><<< 30564 1726882914.00524: stdout chunk (state=3): >>><<< 30564 1726882914.00547: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882914.00559: _low_level_execute_command(): starting 30564 1726882914.00567: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853 `" && echo ansible-tmp-1726882914.005464-35419-225032623520853="` echo /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853 `" ) && sleep 0' 30564 1726882914.01217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882914.01226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.01237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.01251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.01293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.01305: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882914.01315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.01329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882914.01336: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882914.01343: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882914.01351: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882914.01360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.01377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.01386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.01395: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882914.01400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.01481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882914.01505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882914.01510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.01635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.03597: stdout chunk (state=3): >>>ansible-tmp-1726882914.005464-35419-225032623520853=/root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853 <<< 30564 1726882914.03728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.03732: stdout chunk (state=3): >>><<< 30564 1726882914.03734: stderr chunk (state=3): >>><<< 30564 1726882914.03736: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882914.005464-35419-225032623520853=/root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882914.03752: variable 'ansible_module_compression' from source: unknown 30564 1726882914.03810: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882914.03868: variable 'ansible_facts' from source: unknown 30564 1726882914.04075: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853/AnsiballZ_systemd.py 30564 1726882914.04232: Sending initial data 30564 1726882914.04235: Sent initial data (155 bytes) 30564 1726882914.05783: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882914.05827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.05862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.05869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.05914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.05918: stderr chunk (state=3): >>>debug2: match not found <<< 30564 
1726882914.05925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.05958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882914.05961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882914.05980: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882914.05988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.06010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.06013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.06016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.06020: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882914.06030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.06121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882914.06161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882914.06167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.06287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.08046: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882914.08138: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882914.08241: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpq9thr_nt /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853/AnsiballZ_systemd.py <<< 30564 1726882914.08343: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882914.11028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.11128: stderr chunk (state=3): >>><<< 30564 1726882914.11131: stdout chunk (state=3): >>><<< 30564 1726882914.11147: done transferring module to remote 30564 1726882914.11156: _low_level_execute_command(): starting 30564 1726882914.11160: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853/ /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853/AnsiballZ_systemd.py && sleep 0' 30564 1726882914.11599: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.11603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.11639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.11657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.11660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.11715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882914.11718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.11822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.13605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.13645: stderr chunk (state=3): >>><<< 30564 1726882914.13648: stdout chunk (state=3): >>><<< 30564 1726882914.13665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882914.13669: _low_level_execute_command(): starting 30564 1726882914.13682: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853/AnsiballZ_systemd.py && sleep 0' 30564 1726882914.14121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.14124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.14156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.14165: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.14168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.14222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882914.14227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.14331: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.39270: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager 
org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 30564 1726882914.39309: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9191424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2372950000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": 
"13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 30564 1726882914.39322: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", 
"CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882914.40807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882914.40895: stderr chunk (state=3): >>><<< 30564 1726882914.40898: stdout chunk (state=3): >>><<< 30564 1726882914.40976: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9191424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2372950000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882914.41143: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882914.41175: _low_level_execute_command(): starting 30564 1726882914.41192: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882914.005464-35419-225032623520853/ > /dev/null 2>&1 && sleep 0' 30564 1726882914.42141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882914.42154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.42170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.42187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.42236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.42247: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882914.42259: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.42279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882914.42290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882914.42300: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882914.42316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.42337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.42354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.42370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.42382: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882914.42396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.42503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882914.42520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882914.42539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.42679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.44477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.44530: stderr chunk (state=3): >>><<< 30564 1726882914.44533: stdout chunk (state=3): >>><<< 30564 1726882914.44561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882914.44577: handler run complete 30564 1726882914.44628: attempt loop complete, returning result 30564 1726882914.44631: _execute() done 30564 1726882914.44634: dumping result to json 30564 1726882914.44674: done dumping result, returning 30564 1726882914.44678: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-000000002334] 30564 1726882914.44680: sending task result for task 0e448fcc-3ce9-4216-acec-000000002334 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882914.45567: no more pending results, returning what we have 30564 1726882914.45571: results queue empty 30564 1726882914.45572: checking for any_errors_fatal 30564 1726882914.45575: done checking for any_errors_fatal 30564 1726882914.45576: checking for max_fail_percentage 30564 1726882914.45578: done checking for max_fail_percentage 30564 
1726882914.45579: checking to see if all hosts have failed and the running result is not ok 30564 1726882914.45579: done checking to see if all hosts have failed 30564 1726882914.45580: getting the remaining hosts for this loop 30564 1726882914.45582: done getting the remaining hosts for this loop 30564 1726882914.45585: getting the next task for host managed_node2 30564 1726882914.45591: done getting next task for host managed_node2 30564 1726882914.45595: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882914.45614: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882914.45635: getting variables 30564 1726882914.45636: in VariableManager get_vars() 30564 1726882914.45672: Calling all_inventory to load vars for managed_node2 30564 1726882914.45675: Calling groups_inventory to load vars for managed_node2 30564 1726882914.45677: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882914.45687: Calling all_plugins_play to load vars for managed_node2 30564 1726882914.45690: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882914.45693: Calling groups_plugins_play to load vars for managed_node2 30564 1726882914.46784: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002334 30564 1726882914.46803: WORKER PROCESS EXITING 30564 1726882914.49131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882914.52550: done with get_vars() 30564 1726882914.52627: done getting variables 30564 1726882914.52729: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:41:54 -0400 (0:00:00.768) 0:01:53.108 ****** 30564 1726882914.52768: entering _queue_task() for managed_node2/service 30564 1726882914.53325: worker is 1 (out of 1 available) 30564 1726882914.53339: exiting _queue_task() for managed_node2/service 30564 1726882914.53352: done queuing things up, now waiting for results queue to drain 30564 1726882914.53354: waiting for pending results... 
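The recurring `auto-mux: Trying existing master` / `mux_client_request_session: master session id: 2` stderr lines above show every module invocation reusing one multiplexed SSH connection instead of opening a new one. Behaviour like this is typically produced by ControlMaster options; the following host-vars fragment is illustrative only (this run used its own `ansible_ssh_extra_args` from the inventory, whose exact value is not shown in the log):

```yaml
# Illustrative host vars reproducing the connection-sharing behaviour
# logged as "auto-mux: Trying existing master". Not taken from this run.
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ssh/cm-%r@%h:%p
```

Reuse of the master is why only the first command in the log pays the full SSH handshake cost; later `_low_level_execute_command()` calls complete in tens of milliseconds.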
30564 1726882914.53713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882914.53891: in run() - task 0e448fcc-3ce9-4216-acec-000000002335 30564 1726882914.53917: variable 'ansible_search_path' from source: unknown 30564 1726882914.53925: variable 'ansible_search_path' from source: unknown 30564 1726882914.54018: calling self._execute() 30564 1726882914.54206: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882914.54242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882914.54261: variable 'omit' from source: magic vars 30564 1726882914.54952: variable 'ansible_distribution_major_version' from source: facts 30564 1726882914.54977: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882914.55127: variable 'network_provider' from source: set_fact 30564 1726882914.55138: Evaluated conditional (network_provider == "nm"): True 30564 1726882914.55517: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882914.55757: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882914.55973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882914.58917: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882914.58976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882914.59029: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882914.59071: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882914.59100: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882914.59305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882914.59327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882914.59346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882914.59376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882914.59388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882914.59420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882914.59438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882914.59456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882914.59486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882914.59497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882914.59526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882914.59541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882914.59560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882914.59589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882914.59600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882914.59704: variable 'network_connections' from source: include params 30564 1726882914.59714: variable 'interface' from source: play vars 30564 1726882914.59761: variable 'interface' from source: play vars 30564 1726882914.59815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882914.59931: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882914.59958: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882914.59985: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882914.60007: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882914.60037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882914.60054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882914.60076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882914.60095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882914.60131: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882914.60294: variable 'network_connections' from source: include params 30564 1726882914.60298: variable 'interface' from source: play vars 30564 1726882914.60371: variable 'interface' from source: play vars 30564 1726882914.60382: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882914.60386: when evaluation is False, skipping this task 30564 1726882914.60388: _execute() done 30564 1726882914.60391: dumping result to json 30564 1726882914.60393: done dumping result, returning 30564 1726882914.60400: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-000000002335] 30564 
1726882914.60410: sending task result for task 0e448fcc-3ce9-4216-acec-000000002335 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882914.60600: no more pending results, returning what we have 30564 1726882914.60604: results queue empty 30564 1726882914.60605: checking for any_errors_fatal 30564 1726882914.60630: done checking for any_errors_fatal 30564 1726882914.60631: checking for max_fail_percentage 30564 1726882914.60632: done checking for max_fail_percentage 30564 1726882914.60633: checking to see if all hosts have failed and the running result is not ok 30564 1726882914.60634: done checking to see if all hosts have failed 30564 1726882914.60635: getting the remaining hosts for this loop 30564 1726882914.60637: done getting the remaining hosts for this loop 30564 1726882914.60641: getting the next task for host managed_node2 30564 1726882914.60691: done getting next task for host managed_node2 30564 1726882914.60695: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882914.60700: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882914.60721: getting variables 30564 1726882914.60722: in VariableManager get_vars() 30564 1726882914.60798: Calling all_inventory to load vars for managed_node2 30564 1726882914.60801: Calling groups_inventory to load vars for managed_node2 30564 1726882914.60803: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882914.60812: Calling all_plugins_play to load vars for managed_node2 30564 1726882914.60815: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882914.60817: Calling groups_plugins_play to load vars for managed_node2 30564 1726882914.61981: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002335 30564 1726882914.61988: WORKER PROCESS EXITING 30564 1726882914.62294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882914.63661: done with get_vars() 30564 1726882914.63686: done getting variables 30564 1726882914.63729: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:41:54 -0400 (0:00:00.109) 0:01:53.218 ****** 30564 1726882914.63752: entering _queue_task() for managed_node2/service 30564 1726882914.64025: worker is 1 (out of 1 available) 30564 
1726882914.64036: exiting _queue_task() for managed_node2/service 30564 1726882914.64050: done queuing things up, now waiting for results queue to drain 30564 1726882914.64051: waiting for pending results... 30564 1726882914.64286: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882914.64384: in run() - task 0e448fcc-3ce9-4216-acec-000000002336 30564 1726882914.64396: variable 'ansible_search_path' from source: unknown 30564 1726882914.64401: variable 'ansible_search_path' from source: unknown 30564 1726882914.64429: calling self._execute() 30564 1726882914.64513: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882914.64518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882914.64526: variable 'omit' from source: magic vars 30564 1726882914.64817: variable 'ansible_distribution_major_version' from source: facts 30564 1726882914.64829: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882914.64914: variable 'network_provider' from source: set_fact 30564 1726882914.64918: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882914.64920: when evaluation is False, skipping this task 30564 1726882914.64923: _execute() done 30564 1726882914.64928: dumping result to json 30564 1726882914.64930: done dumping result, returning 30564 1726882914.64938: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-000000002336] 30564 1726882914.64943: sending task result for task 0e448fcc-3ce9-4216-acec-000000002336 30564 1726882914.65033: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002336 30564 1726882914.65036: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 
1726882914.65105: no more pending results, returning what we have 30564 1726882914.65109: results queue empty 30564 1726882914.65110: checking for any_errors_fatal 30564 1726882914.65116: done checking for any_errors_fatal 30564 1726882914.65117: checking for max_fail_percentage 30564 1726882914.65119: done checking for max_fail_percentage 30564 1726882914.65120: checking to see if all hosts have failed and the running result is not ok 30564 1726882914.65121: done checking to see if all hosts have failed 30564 1726882914.65122: getting the remaining hosts for this loop 30564 1726882914.65124: done getting the remaining hosts for this loop 30564 1726882914.65127: getting the next task for host managed_node2 30564 1726882914.65135: done getting next task for host managed_node2 30564 1726882914.65139: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882914.65151: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882914.65175: getting variables 30564 1726882914.65177: in VariableManager get_vars() 30564 1726882914.65214: Calling all_inventory to load vars for managed_node2 30564 1726882914.65216: Calling groups_inventory to load vars for managed_node2 30564 1726882914.65218: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882914.65227: Calling all_plugins_play to load vars for managed_node2 30564 1726882914.65229: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882914.65232: Calling groups_plugins_play to load vars for managed_node2 30564 1726882914.66491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882914.67475: done with get_vars() 30564 1726882914.67491: done getting variables 30564 1726882914.67532: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:41:54 -0400 (0:00:00.038) 0:01:53.256 ****** 30564 1726882914.67559: entering _queue_task() for managed_node2/copy 30564 1726882914.67774: worker is 1 (out of 1 available) 30564 1726882914.67787: exiting _queue_task() for managed_node2/copy 30564 1726882914.67798: done queuing things up, now waiting for results queue to drain 30564 1726882914.67800: waiting for pending results... 
30564 1726882914.67994: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882914.68079: in run() - task 0e448fcc-3ce9-4216-acec-000000002337 30564 1726882914.68091: variable 'ansible_search_path' from source: unknown 30564 1726882914.68094: variable 'ansible_search_path' from source: unknown 30564 1726882914.68129: calling self._execute() 30564 1726882914.68215: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882914.68224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882914.68233: variable 'omit' from source: magic vars 30564 1726882914.68515: variable 'ansible_distribution_major_version' from source: facts 30564 1726882914.68525: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882914.68608: variable 'network_provider' from source: set_fact 30564 1726882914.68614: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882914.68617: when evaluation is False, skipping this task 30564 1726882914.68619: _execute() done 30564 1726882914.68622: dumping result to json 30564 1726882914.68626: done dumping result, returning 30564 1726882914.68634: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-000000002337] 30564 1726882914.68641: sending task result for task 0e448fcc-3ce9-4216-acec-000000002337 30564 1726882914.68730: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002337 30564 1726882914.68733: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882914.68787: no more pending results, returning what we have 30564 1726882914.68791: results queue empty 30564 1726882914.68792: checking for 
any_errors_fatal 30564 1726882914.68797: done checking for any_errors_fatal 30564 1726882914.68798: checking for max_fail_percentage 30564 1726882914.68799: done checking for max_fail_percentage 30564 1726882914.68800: checking to see if all hosts have failed and the running result is not ok 30564 1726882914.68801: done checking to see if all hosts have failed 30564 1726882914.68802: getting the remaining hosts for this loop 30564 1726882914.68803: done getting the remaining hosts for this loop 30564 1726882914.68806: getting the next task for host managed_node2 30564 1726882914.68814: done getting next task for host managed_node2 30564 1726882914.68817: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882914.68822: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882914.68843: getting variables 30564 1726882914.68844: in VariableManager get_vars() 30564 1726882914.68888: Calling all_inventory to load vars for managed_node2 30564 1726882914.68891: Calling groups_inventory to load vars for managed_node2 30564 1726882914.68893: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882914.68899: Calling all_plugins_play to load vars for managed_node2 30564 1726882914.68901: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882914.68903: Calling groups_plugins_play to load vars for managed_node2 30564 1726882914.69796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882914.70742: done with get_vars() 30564 1726882914.70756: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:41:54 -0400 (0:00:00.032) 0:01:53.289 ****** 30564 1726882914.70821: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882914.71012: worker is 1 (out of 1 available) 30564 1726882914.71026: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882914.71038: done queuing things up, now waiting for results queue to drain 30564 1726882914.71039: waiting for pending results... 
30564 1726882914.71225: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882914.71311: in run() - task 0e448fcc-3ce9-4216-acec-000000002338 30564 1726882914.71323: variable 'ansible_search_path' from source: unknown 30564 1726882914.71327: variable 'ansible_search_path' from source: unknown 30564 1726882914.71354: calling self._execute() 30564 1726882914.71426: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882914.71430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882914.71439: variable 'omit' from source: magic vars 30564 1726882914.71713: variable 'ansible_distribution_major_version' from source: facts 30564 1726882914.71723: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882914.71729: variable 'omit' from source: magic vars 30564 1726882914.71776: variable 'omit' from source: magic vars 30564 1726882914.71885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882914.73412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882914.73459: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882914.73490: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882914.73514: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882914.73536: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882914.73593: variable 'network_provider' from source: set_fact 30564 1726882914.73686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882914.73705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882914.73723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882914.73753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882914.73766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882914.73816: variable 'omit' from source: magic vars 30564 1726882914.73894: variable 'omit' from source: magic vars 30564 1726882914.73962: variable 'network_connections' from source: include params 30564 1726882914.73976: variable 'interface' from source: play vars 30564 1726882914.74020: variable 'interface' from source: play vars 30564 1726882914.74126: variable 'omit' from source: magic vars 30564 1726882914.74133: variable '__lsr_ansible_managed' from source: task vars 30564 1726882914.74178: variable '__lsr_ansible_managed' from source: task vars 30564 1726882914.74317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882914.74454: Loaded config def from plugin (lookup/template) 30564 1726882914.74458: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882914.74482: File lookup term: get_ansible_managed.j2 30564 1726882914.74485: variable 
'ansible_search_path' from source: unknown 30564 1726882914.74489: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882914.74500: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882914.74515: variable 'ansible_search_path' from source: unknown 30564 1726882914.83423: variable 'ansible_managed' from source: unknown 30564 1726882914.83511: variable 'omit' from source: magic vars 30564 1726882914.83531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882914.83553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882914.83561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882914.83577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882914.83584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882914.83598: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882914.83601: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882914.83604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882914.83667: Set connection var ansible_timeout to 10 30564 1726882914.83674: Set connection var ansible_pipelining to False 30564 1726882914.83677: Set connection var ansible_shell_type to sh 30564 1726882914.83682: Set connection var ansible_shell_executable to /bin/sh 30564 1726882914.83689: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882914.83691: Set connection var ansible_connection to ssh 30564 1726882914.83709: variable 'ansible_shell_executable' from source: unknown 30564 1726882914.83712: variable 'ansible_connection' from source: unknown 30564 1726882914.83714: variable 'ansible_module_compression' from source: unknown 30564 1726882914.83716: variable 'ansible_shell_type' from source: unknown 30564 1726882914.83718: variable 'ansible_shell_executable' from source: unknown 30564 1726882914.83721: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882914.83724: variable 'ansible_pipelining' from source: unknown 30564 1726882914.83727: variable 'ansible_timeout' from source: unknown 30564 1726882914.83731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882914.83822: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882914.83832: variable 'omit' from 
source: magic vars 30564 1726882914.83835: starting attempt loop 30564 1726882914.83837: running the handler 30564 1726882914.83846: _low_level_execute_command(): starting 30564 1726882914.83849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882914.84554: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.84557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.84560: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882914.84676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.84931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882914.84935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882914.84937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.84951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.86406: stdout chunk (state=3): >>>/root <<< 30564 1726882914.86510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.86561: stderr chunk 
(state=3): >>><<< 30564 1726882914.86566: stdout chunk (state=3): >>><<< 30564 1726882914.86628: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882914.86631: _low_level_execute_command(): starting 30564 1726882914.86634: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369 `" && echo ansible-tmp-1726882914.8658075-35456-85270362736369="` echo /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369 `" ) && sleep 0' 30564 1726882914.87178: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.87182: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.87227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.87230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.87245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.87251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.87342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882914.87362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.87496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.89384: stdout chunk (state=3): >>>ansible-tmp-1726882914.8658075-35456-85270362736369=/root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369 <<< 30564 1726882914.89883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.89945: stderr chunk (state=3): >>><<< 30564 1726882914.89948: stdout chunk (state=3): >>><<< 30564 1726882914.90073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882914.8658075-35456-85270362736369=/root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882914.90077: variable 'ansible_module_compression' from source: unknown 30564 1726882914.90079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882914.90186: variable 'ansible_facts' from source: unknown 30564 1726882914.90294: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369/AnsiballZ_network_connections.py 30564 1726882914.90730: Sending initial data 30564 1726882914.90735: Sent initial data (167 bytes) 30564 1726882914.91492: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882914.91495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.91498: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.91532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.91535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.91537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.91641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882914.91653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.91789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882914.93540: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882914.93544: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882914.93634: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882914.93733: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpz2dtdmt4 /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369/AnsiballZ_network_connections.py <<< 30564 1726882914.93823: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882914.96022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882914.96190: stderr chunk (state=3): >>><<< 30564 1726882914.96193: stdout chunk (state=3): >>><<< 30564 1726882914.96195: done transferring module to remote 30564 1726882914.96197: _low_level_execute_command(): starting 30564 1726882914.96199: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369/ /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369/AnsiballZ_network_connections.py && sleep 0' 30564 1726882914.97511: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882914.97644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.97660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.97684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.97727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.97762: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882914.97781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.97798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 
1726882914.97856: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882914.97875: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882914.97887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882914.97898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882914.97912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882914.97922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882914.97931: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882914.97942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882914.98024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882914.98093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882914.98108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882914.98306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.00203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882915.00206: stdout chunk (state=3): >>><<< 30564 1726882915.00209: stderr chunk (state=3): >>><<< 30564 1726882915.00278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882915.00283: _low_level_execute_command(): starting 30564 1726882915.00285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369/AnsiballZ_network_connections.py && sleep 0' 30564 1726882915.02005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882915.02052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.02076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.02096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.02194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.02264: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882915.02291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.02309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882915.02322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882915.02334: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882915.02347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.02378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.02398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.02412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.02424: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882915.02438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.02543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882915.02616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882915.02632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.02810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.25043: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882915.26461: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882915.26517: stderr chunk (state=3): >>><<< 30564 1726882915.26520: stdout chunk (state=3): >>><<< 30564 1726882915.26535: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f skipped because already active\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "state": "up"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882915.26561: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'state': 'up'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882915.26571: _low_level_execute_command(): starting 30564 1726882915.26574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882914.8658075-35456-85270362736369/ > /dev/null 2>&1 && sleep 0' 30564 1726882915.27021: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.27025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.27055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.27058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.27109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882915.27113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.27218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.29668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882915.29675: stdout chunk (state=3): >>><<< 30564 1726882915.29677: stderr chunk (state=3): >>><<< 30564 1726882915.29679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882915.29682: handler run complete 30564 1726882915.29684: attempt loop complete, returning result 30564 1726882915.29696: _execute() done 30564 1726882915.29699: dumping result to json 30564 1726882915.29701: done dumping result, returning 30564 1726882915.29703: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-000000002338] 30564 1726882915.29705: sending task result for task 0e448fcc-3ce9-4216-acec-000000002338 30564 1726882915.29772: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002338 30564 1726882915.29776: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f skipped because already active 30564 1726882915.29860: no more pending results, returning what we have 30564 1726882915.29864: results queue empty 30564 1726882915.29865: checking for any_errors_fatal 30564 1726882915.29871: done checking for any_errors_fatal 30564 1726882915.29872: checking for max_fail_percentage 30564 1726882915.29873: done checking for max_fail_percentage 30564 1726882915.29874: checking to see if all hosts have failed and the running result is not ok 30564 1726882915.29874: done checking to see if all hosts have failed 30564 1726882915.29875: getting the remaining hosts for this loop 30564 
1726882915.29876: done getting the remaining hosts for this loop 30564 1726882915.29879: getting the next task for host managed_node2 30564 1726882915.29885: done getting next task for host managed_node2 30564 1726882915.29887: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882915.29891: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882915.29900: getting variables 30564 1726882915.29902: in VariableManager get_vars() 30564 1726882915.29943: Calling all_inventory to load vars for managed_node2 30564 1726882915.29945: Calling groups_inventory to load vars for managed_node2 30564 1726882915.29947: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882915.29954: Calling all_plugins_play to load vars for managed_node2 30564 1726882915.29956: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882915.29958: Calling groups_plugins_play to load vars for managed_node2 30564 1726882915.36049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882915.36994: done with get_vars() 30564 1726882915.37013: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:41:55 -0400 (0:00:00.662) 0:01:53.952 ****** 30564 1726882915.37071: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882915.37314: worker is 1 (out of 1 available) 30564 1726882915.37328: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882915.37341: done queuing things up, now waiting for results queue to drain 30564 1726882915.37343: waiting for pending results... 
30564 1726882915.37541: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882915.37652: in run() - task 0e448fcc-3ce9-4216-acec-000000002339 30564 1726882915.37662: variable 'ansible_search_path' from source: unknown 30564 1726882915.37679: variable 'ansible_search_path' from source: unknown 30564 1726882915.37702: calling self._execute() 30564 1726882915.37822: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.37827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.37833: variable 'omit' from source: magic vars 30564 1726882915.38241: variable 'ansible_distribution_major_version' from source: facts 30564 1726882915.38277: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882915.38480: variable 'network_state' from source: role '' defaults 30564 1726882915.38507: Evaluated conditional (network_state != {}): False 30564 1726882915.38518: when evaluation is False, skipping this task 30564 1726882915.38527: _execute() done 30564 1726882915.38531: dumping result to json 30564 1726882915.38542: done dumping result, returning 30564 1726882915.38546: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-000000002339] 30564 1726882915.38554: sending task result for task 0e448fcc-3ce9-4216-acec-000000002339 30564 1726882915.38690: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002339 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882915.38758: no more pending results, returning what we have 30564 1726882915.38762: results queue empty 30564 1726882915.38765: checking for any_errors_fatal 30564 1726882915.38778: done checking for any_errors_fatal 30564 1726882915.38779: checking for 
max_fail_percentage 30564 1726882915.38780: done checking for max_fail_percentage 30564 1726882915.38781: checking to see if all hosts have failed and the running result is not ok 30564 1726882915.38782: done checking to see if all hosts have failed 30564 1726882915.38783: getting the remaining hosts for this loop 30564 1726882915.38785: done getting the remaining hosts for this loop 30564 1726882915.38789: getting the next task for host managed_node2 30564 1726882915.38796: done getting next task for host managed_node2 30564 1726882915.38806: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882915.38812: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882915.38842: getting variables 30564 1726882915.38844: in VariableManager get_vars() 30564 1726882915.38904: Calling all_inventory to load vars for managed_node2 30564 1726882915.38920: Calling groups_inventory to load vars for managed_node2 30564 1726882915.38927: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882915.38934: WORKER PROCESS EXITING 30564 1726882915.38950: Calling all_plugins_play to load vars for managed_node2 30564 1726882915.38954: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882915.38958: Calling groups_plugins_play to load vars for managed_node2 30564 1726882915.40202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882915.41732: done with get_vars() 30564 1726882915.41762: done getting variables 30564 1726882915.41837: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:41:55 -0400 (0:00:00.048) 0:01:54.000 ****** 30564 1726882915.41888: entering _queue_task() for managed_node2/debug 30564 1726882915.42207: worker is 1 (out of 1 available) 30564 1726882915.42220: exiting _queue_task() for managed_node2/debug 30564 1726882915.42232: done queuing things up, now waiting for results queue to drain 30564 1726882915.42234: waiting for pending results... 
30564 1726882915.42570: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882915.42670: in run() - task 0e448fcc-3ce9-4216-acec-00000000233a 30564 1726882915.42690: variable 'ansible_search_path' from source: unknown 30564 1726882915.42693: variable 'ansible_search_path' from source: unknown 30564 1726882915.42719: calling self._execute() 30564 1726882915.42804: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.42810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.42819: variable 'omit' from source: magic vars 30564 1726882915.43127: variable 'ansible_distribution_major_version' from source: facts 30564 1726882915.43131: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882915.43136: variable 'omit' from source: magic vars 30564 1726882915.43202: variable 'omit' from source: magic vars 30564 1726882915.43226: variable 'omit' from source: magic vars 30564 1726882915.43261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882915.43293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882915.43308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882915.43322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882915.43332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882915.43388: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882915.43393: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.43395: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882915.43452: Set connection var ansible_timeout to 10 30564 1726882915.43456: Set connection var ansible_pipelining to False 30564 1726882915.43459: Set connection var ansible_shell_type to sh 30564 1726882915.43469: Set connection var ansible_shell_executable to /bin/sh 30564 1726882915.43475: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882915.43478: Set connection var ansible_connection to ssh 30564 1726882915.43497: variable 'ansible_shell_executable' from source: unknown 30564 1726882915.43500: variable 'ansible_connection' from source: unknown 30564 1726882915.43503: variable 'ansible_module_compression' from source: unknown 30564 1726882915.43521: variable 'ansible_shell_type' from source: unknown 30564 1726882915.43524: variable 'ansible_shell_executable' from source: unknown 30564 1726882915.43527: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.43529: variable 'ansible_pipelining' from source: unknown 30564 1726882915.43531: variable 'ansible_timeout' from source: unknown 30564 1726882915.43533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.43709: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882915.43718: variable 'omit' from source: magic vars 30564 1726882915.43723: starting attempt loop 30564 1726882915.43726: running the handler 30564 1726882915.43824: variable '__network_connections_result' from source: set_fact 30564 1726882915.43866: handler run complete 30564 1726882915.43882: attempt loop complete, returning result 30564 1726882915.43885: _execute() done 30564 1726882915.43888: dumping result to json 30564 1726882915.43890: 
done dumping result, returning 30564 1726882915.43897: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000233a] 30564 1726882915.43907: sending task result for task 0e448fcc-3ce9-4216-acec-00000000233a 30564 1726882915.43995: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000233a 30564 1726882915.43998: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f skipped because already active" ] } 30564 1726882915.44072: no more pending results, returning what we have 30564 1726882915.44075: results queue empty 30564 1726882915.44076: checking for any_errors_fatal 30564 1726882915.44080: done checking for any_errors_fatal 30564 1726882915.44081: checking for max_fail_percentage 30564 1726882915.44083: done checking for max_fail_percentage 30564 1726882915.44083: checking to see if all hosts have failed and the running result is not ok 30564 1726882915.44084: done checking to see if all hosts have failed 30564 1726882915.44085: getting the remaining hosts for this loop 30564 1726882915.44086: done getting the remaining hosts for this loop 30564 1726882915.44089: getting the next task for host managed_node2 30564 1726882915.44096: done getting next task for host managed_node2 30564 1726882915.44099: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882915.44104: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882915.44121: getting variables 30564 1726882915.44123: in VariableManager get_vars() 30564 1726882915.44156: Calling all_inventory to load vars for managed_node2 30564 1726882915.44159: Calling groups_inventory to load vars for managed_node2 30564 1726882915.44161: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882915.44171: Calling all_plugins_play to load vars for managed_node2 30564 1726882915.44173: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882915.44177: Calling groups_plugins_play to load vars for managed_node2 30564 1726882915.45098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882915.46091: done with get_vars() 30564 1726882915.46106: done getting variables 30564 1726882915.46145: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:41:55 -0400 (0:00:00.042) 0:01:54.043 ****** 30564 1726882915.46174: entering _queue_task() for managed_node2/debug 30564 1726882915.46373: worker is 1 (out of 1 available) 30564 1726882915.46386: exiting _queue_task() for managed_node2/debug 30564 1726882915.46398: done queuing things up, now waiting for results queue to drain 30564 1726882915.46399: waiting for pending results... 30564 1726882915.46587: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882915.46684: in run() - task 0e448fcc-3ce9-4216-acec-00000000233b 30564 1726882915.46697: variable 'ansible_search_path' from source: unknown 30564 1726882915.46700: variable 'ansible_search_path' from source: unknown 30564 1726882915.46726: calling self._execute() 30564 1726882915.46808: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.46812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.46822: variable 'omit' from source: magic vars 30564 1726882915.47110: variable 'ansible_distribution_major_version' from source: facts 30564 1726882915.47121: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882915.47126: variable 'omit' from source: magic vars 30564 1726882915.47173: variable 'omit' from source: magic vars 30564 1726882915.47196: variable 'omit' from source: magic vars 30564 1726882915.47249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882915.47335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882915.47366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882915.47422: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882915.47435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882915.47475: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882915.47480: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.47511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.47581: Set connection var ansible_timeout to 10 30564 1726882915.47591: Set connection var ansible_pipelining to False 30564 1726882915.47594: Set connection var ansible_shell_type to sh 30564 1726882915.47618: Set connection var ansible_shell_executable to /bin/sh 30564 1726882915.47621: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882915.47623: Set connection var ansible_connection to ssh 30564 1726882915.47644: variable 'ansible_shell_executable' from source: unknown 30564 1726882915.47648: variable 'ansible_connection' from source: unknown 30564 1726882915.47651: variable 'ansible_module_compression' from source: unknown 30564 1726882915.47653: variable 'ansible_shell_type' from source: unknown 30564 1726882915.47655: variable 'ansible_shell_executable' from source: unknown 30564 1726882915.47657: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.47659: variable 'ansible_pipelining' from source: unknown 30564 1726882915.47661: variable 'ansible_timeout' from source: unknown 30564 1726882915.47665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.47866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882915.47870: variable 'omit' from source: magic vars 30564 1726882915.47873: starting attempt loop 30564 1726882915.47887: running the handler 30564 1726882915.47922: variable '__network_connections_result' from source: set_fact 30564 1726882915.48006: variable '__network_connections_result' from source: set_fact 30564 1726882915.48146: handler run complete 30564 1726882915.48179: attempt loop complete, returning result 30564 1726882915.48183: _execute() done 30564 1726882915.48185: dumping result to json 30564 1726882915.48187: done dumping result, returning 30564 1726882915.48212: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-00000000233b] 30564 1726882915.48217: sending task result for task 0e448fcc-3ce9-4216-acec-00000000233b 30564 1726882915.48316: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000233b 30564 1726882915.48319: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "state": "up" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f skipped because already active\n", "stderr_lines": [ "[002] #0, state:up persistent_state:present, 'statebr': up connection statebr, 776ac6a9-ad06-421f-84d7-faa75bbe803f skipped because already active" ] } } 30564 1726882915.48553: no more pending results, returning what we have 30564 1726882915.48556: results queue empty 30564 
1726882915.48557: checking for any_errors_fatal 30564 1726882915.48562: done checking for any_errors_fatal 30564 1726882915.48565: checking for max_fail_percentage 30564 1726882915.48566: done checking for max_fail_percentage 30564 1726882915.48567: checking to see if all hosts have failed and the running result is not ok 30564 1726882915.48569: done checking to see if all hosts have failed 30564 1726882915.48570: getting the remaining hosts for this loop 30564 1726882915.48571: done getting the remaining hosts for this loop 30564 1726882915.48573: getting the next task for host managed_node2 30564 1726882915.48579: done getting next task for host managed_node2 30564 1726882915.48581: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882915.48584: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882915.48599: getting variables 30564 1726882915.48600: in VariableManager get_vars() 30564 1726882915.48645: Calling all_inventory to load vars for managed_node2 30564 1726882915.48648: Calling groups_inventory to load vars for managed_node2 30564 1726882915.48655: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882915.48662: Calling all_plugins_play to load vars for managed_node2 30564 1726882915.48666: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882915.48669: Calling groups_plugins_play to load vars for managed_node2 30564 1726882915.50073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882915.51342: done with get_vars() 30564 1726882915.51382: done getting variables 30564 1726882915.51444: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:41:55 -0400 (0:00:00.053) 0:01:54.096 ****** 30564 1726882915.51506: entering _queue_task() for managed_node2/debug 30564 1726882915.51778: worker is 1 (out of 1 available) 30564 1726882915.51792: exiting _queue_task() for managed_node2/debug 30564 1726882915.51804: done queuing things up, now waiting for results queue to drain 30564 1726882915.51805: waiting for pending results... 
30564 1726882915.52103: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882915.52219: in run() - task 0e448fcc-3ce9-4216-acec-00000000233c 30564 1726882915.52230: variable 'ansible_search_path' from source: unknown 30564 1726882915.52234: variable 'ansible_search_path' from source: unknown 30564 1726882915.52260: calling self._execute() 30564 1726882915.52352: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.52355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.52366: variable 'omit' from source: magic vars 30564 1726882915.52651: variable 'ansible_distribution_major_version' from source: facts 30564 1726882915.52663: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882915.52752: variable 'network_state' from source: role '' defaults 30564 1726882915.52761: Evaluated conditional (network_state != {}): False 30564 1726882915.52766: when evaluation is False, skipping this task 30564 1726882915.52771: _execute() done 30564 1726882915.52774: dumping result to json 30564 1726882915.52777: done dumping result, returning 30564 1726882915.52780: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-00000000233c] 30564 1726882915.52787: sending task result for task 0e448fcc-3ce9-4216-acec-00000000233c 30564 1726882915.52883: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000233c 30564 1726882915.52886: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882915.52929: no more pending results, returning what we have 30564 1726882915.52933: results queue empty 30564 1726882915.52934: checking for any_errors_fatal 30564 1726882915.52941: done checking for any_errors_fatal 30564 1726882915.52942: checking for 
max_fail_percentage 30564 1726882915.52944: done checking for max_fail_percentage 30564 1726882915.52945: checking to see if all hosts have failed and the running result is not ok 30564 1726882915.52945: done checking to see if all hosts have failed 30564 1726882915.52946: getting the remaining hosts for this loop 30564 1726882915.52948: done getting the remaining hosts for this loop 30564 1726882915.52951: getting the next task for host managed_node2 30564 1726882915.52958: done getting next task for host managed_node2 30564 1726882915.52961: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882915.52970: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882915.52998: getting variables 30564 1726882915.53000: in VariableManager get_vars() 30564 1726882915.53033: Calling all_inventory to load vars for managed_node2 30564 1726882915.53035: Calling groups_inventory to load vars for managed_node2 30564 1726882915.53037: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882915.53043: Calling all_plugins_play to load vars for managed_node2 30564 1726882915.53045: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882915.53046: Calling groups_plugins_play to load vars for managed_node2 30564 1726882915.53852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882915.55190: done with get_vars() 30564 1726882915.55205: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:41:55 -0400 (0:00:00.037) 0:01:54.134 ****** 30564 1726882915.55270: entering _queue_task() for managed_node2/ping 30564 1726882915.55461: worker is 1 (out of 1 available) 30564 1726882915.55475: exiting _queue_task() for managed_node2/ping 30564 1726882915.55487: done queuing things up, now waiting for results queue to drain 30564 1726882915.55489: waiting for pending results... 
30564 1726882915.55690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882915.55783: in run() - task 0e448fcc-3ce9-4216-acec-00000000233d 30564 1726882915.55794: variable 'ansible_search_path' from source: unknown 30564 1726882915.55797: variable 'ansible_search_path' from source: unknown 30564 1726882915.55827: calling self._execute() 30564 1726882915.55904: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.55908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.55919: variable 'omit' from source: magic vars 30564 1726882915.56215: variable 'ansible_distribution_major_version' from source: facts 30564 1726882915.56226: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882915.56231: variable 'omit' from source: magic vars 30564 1726882915.56308: variable 'omit' from source: magic vars 30564 1726882915.56320: variable 'omit' from source: magic vars 30564 1726882915.56355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882915.56423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882915.56429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882915.56445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882915.56493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882915.56554: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882915.56558: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.56561: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882915.56616: Set connection var ansible_timeout to 10 30564 1726882915.56651: Set connection var ansible_pipelining to False 30564 1726882915.56654: Set connection var ansible_shell_type to sh 30564 1726882915.56658: Set connection var ansible_shell_executable to /bin/sh 30564 1726882915.56661: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882915.56663: Set connection var ansible_connection to ssh 30564 1726882915.56674: variable 'ansible_shell_executable' from source: unknown 30564 1726882915.56677: variable 'ansible_connection' from source: unknown 30564 1726882915.56680: variable 'ansible_module_compression' from source: unknown 30564 1726882915.56682: variable 'ansible_shell_type' from source: unknown 30564 1726882915.56685: variable 'ansible_shell_executable' from source: unknown 30564 1726882915.56687: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.56690: variable 'ansible_pipelining' from source: unknown 30564 1726882915.56698: variable 'ansible_timeout' from source: unknown 30564 1726882915.56701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.56966: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882915.56974: variable 'omit' from source: magic vars 30564 1726882915.56992: starting attempt loop 30564 1726882915.56995: running the handler 30564 1726882915.57000: _low_level_execute_command(): starting 30564 1726882915.57003: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882915.57538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 
1726882915.57556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.57579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.57624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.57683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882915.57700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.57813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.59472: stdout chunk (state=3): >>>/root <<< 30564 1726882915.59571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882915.59633: stderr chunk (state=3): >>><<< 30564 1726882915.59639: stdout chunk (state=3): >>><<< 30564 1726882915.59672: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882915.59687: _low_level_execute_command(): starting 30564 1726882915.59690: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062 `" && echo ansible-tmp-1726882915.596676-35493-112933495146062="` echo /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062 `" ) && sleep 0' 30564 1726882915.60321: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.60324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.60352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.60373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.60424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882915.60427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882915.60432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.60529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.62395: stdout chunk (state=3): >>>ansible-tmp-1726882915.596676-35493-112933495146062=/root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062 <<< 30564 1726882915.62495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882915.62536: stderr chunk (state=3): >>><<< 30564 1726882915.62539: stdout chunk (state=3): >>><<< 30564 1726882915.62577: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882915.596676-35493-112933495146062=/root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882915.62708: variable 'ansible_module_compression' from source: unknown 30564 1726882915.62711: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882915.62714: variable 'ansible_facts' from source: unknown 30564 1726882915.62883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062/AnsiballZ_ping.py 30564 1726882915.63095: Sending initial data 30564 1726882915.63102: Sent initial data (152 bytes) 30564 1726882915.64486: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.64490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.64493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.64528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.64535: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882915.64550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.64556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882915.64569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.64580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.64589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.64594: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.64642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882915.64664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882915.64668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.64788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.66518: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882915.66610: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882915.66707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmps4tuahj8 /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062/AnsiballZ_ping.py <<< 30564 1726882915.66801: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882915.67797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882915.67948: stderr chunk (state=3): >>><<< 30564 1726882915.67951: stdout chunk (state=3): >>><<< 30564 1726882915.67954: done transferring module to remote 30564 1726882915.67956: _low_level_execute_command(): starting 30564 1726882915.67958: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062/ /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062/AnsiballZ_ping.py && sleep 0' 30564 1726882915.68352: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.68357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.68394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.68402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882915.68410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882915.68416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.68439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882915.68442: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.68490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882915.68499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882915.68513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.68621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.70373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882915.70420: stderr chunk (state=3): >>><<< 30564 1726882915.70423: stdout chunk (state=3): >>><<< 30564 1726882915.70441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882915.70444: _low_level_execute_command(): starting 30564 1726882915.70449: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062/AnsiballZ_ping.py && sleep 0' 30564 1726882915.71035: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882915.71045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.71055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.71070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.71105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.71112: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882915.71122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.71135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882915.71143: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882915.71153: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882915.71156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.71171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.71180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.71188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.71194: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882915.71204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.71274: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 30564 1726882915.71287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882915.71298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.71441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.84303: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882915.85388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882915.85414: stderr chunk (state=3): >>><<< 30564 1726882915.85417: stdout chunk (state=3): >>><<< 30564 1726882915.85437: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.11.158 closed. 30564 1726882915.85464: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882915.85478: _low_level_execute_command(): starting 30564 1726882915.85484: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882915.596676-35493-112933495146062/ > /dev/null 2>&1 && sleep 0' 30564 1726882915.86223: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882915.86239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.86250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.86268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.86308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.86315: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882915.86328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.86348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882915.86356: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address <<< 30564 1726882915.86366: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882915.86382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882915.86391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882915.86403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882915.86410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882915.86417: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882915.86427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882915.86508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882915.86523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882915.86536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882915.86659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882915.88545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882915.88549: stdout chunk (state=3): >>><<< 30564 1726882915.88556: stderr chunk (state=3): >>><<< 30564 1726882915.88578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882915.88584: handler run complete 30564 1726882915.88602: attempt loop complete, returning result 30564 1726882915.88605: _execute() done 30564 1726882915.88608: dumping result to json 30564 1726882915.88610: done dumping result, returning 30564 1726882915.88620: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-00000000233d] 30564 1726882915.88627: sending task result for task 0e448fcc-3ce9-4216-acec-00000000233d 30564 1726882915.88724: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000233d 30564 1726882915.88726: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882915.88840: no more pending results, returning what we have 30564 1726882915.88844: results queue empty 30564 1726882915.88845: checking for any_errors_fatal 30564 1726882915.88852: done checking for any_errors_fatal 30564 1726882915.88853: checking for max_fail_percentage 30564 1726882915.88854: done checking for max_fail_percentage 30564 1726882915.88855: checking to see if all hosts have failed and the running result is not ok 30564 1726882915.88856: done checking to see if all hosts have failed 30564 1726882915.88857: getting the remaining hosts for this loop 30564 1726882915.88859: done getting the 
remaining hosts for this loop 30564 1726882915.88863: getting the next task for host managed_node2 30564 1726882915.88888: done getting next task for host managed_node2 30564 1726882915.88891: ^ task is: TASK: meta (role_complete) 30564 1726882915.88897: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882915.88911: getting variables 30564 1726882915.88914: in VariableManager get_vars() 30564 1726882915.88970: Calling all_inventory to load vars for managed_node2 30564 1726882915.88972: Calling groups_inventory to load vars for managed_node2 30564 1726882915.88975: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882915.89103: Calling all_plugins_play to load vars for managed_node2 30564 1726882915.89106: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882915.89109: Calling groups_plugins_play to load vars for managed_node2 30564 1726882915.92019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882915.94599: done with get_vars() 30564 1726882915.94627: done getting variables 30564 1726882915.94731: done queuing things up, now waiting for results queue to drain 30564 1726882915.94734: results queue empty 30564 1726882915.94735: checking for any_errors_fatal 30564 1726882915.94738: done checking for any_errors_fatal 30564 1726882915.94739: checking for max_fail_percentage 30564 1726882915.94740: done checking for max_fail_percentage 30564 1726882915.94741: checking to see if all hosts have failed and the running result is not ok 30564 1726882915.94741: done checking to see if all hosts have failed 30564 1726882915.94746: getting the remaining hosts for this loop 30564 1726882915.94748: done getting the remaining hosts for this loop 30564 1726882915.94751: getting the next task for host managed_node2 30564 1726882915.94761: done getting next task for host managed_node2 30564 1726882915.94764: ^ task is: TASK: Include network role 30564 1726882915.94771: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882915.94775: getting variables 30564 1726882915.94776: in VariableManager get_vars() 30564 1726882915.94789: Calling all_inventory to load vars for managed_node2 30564 1726882915.94791: Calling groups_inventory to load vars for managed_node2 30564 1726882915.94794: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882915.94799: Calling all_plugins_play to load vars for managed_node2 30564 1726882915.94801: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882915.94804: Calling groups_plugins_play to load vars for managed_node2 30564 1726882915.96184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882915.98095: done with get_vars() 30564 1726882915.98134: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:41:55 -0400 (0:00:00.429) 0:01:54.563 ****** 30564 1726882915.98216: entering _queue_task() for managed_node2/include_role 30564 1726882915.98599: worker is 1 (out of 1 available) 30564 1726882915.98614: exiting _queue_task() for managed_node2/include_role 30564 
1726882915.98627: done queuing things up, now waiting for results queue to drain 30564 1726882915.98629: waiting for pending results... 30564 1726882915.98975: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882915.99131: in run() - task 0e448fcc-3ce9-4216-acec-000000002142 30564 1726882915.99145: variable 'ansible_search_path' from source: unknown 30564 1726882915.99149: variable 'ansible_search_path' from source: unknown 30564 1726882915.99190: calling self._execute() 30564 1726882915.99302: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882915.99315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882915.99331: variable 'omit' from source: magic vars 30564 1726882915.99775: variable 'ansible_distribution_major_version' from source: facts 30564 1726882915.99788: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882915.99794: _execute() done 30564 1726882915.99797: dumping result to json 30564 1726882915.99801: done dumping result, returning 30564 1726882915.99809: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-000000002142] 30564 1726882915.99814: sending task result for task 0e448fcc-3ce9-4216-acec-000000002142 30564 1726882915.99932: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002142 30564 1726882915.99936: WORKER PROCESS EXITING 30564 1726882915.99971: no more pending results, returning what we have 30564 1726882915.99976: in VariableManager get_vars() 30564 1726882916.00025: Calling all_inventory to load vars for managed_node2 30564 1726882916.00028: Calling groups_inventory to load vars for managed_node2 30564 1726882916.00032: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882916.00047: Calling all_plugins_play to load vars for managed_node2 30564 1726882916.00051: Calling groups_plugins_inventory to load vars for managed_node2 30564 
1726882916.00054: Calling groups_plugins_play to load vars for managed_node2 30564 1726882916.02034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882916.03898: done with get_vars() 30564 1726882916.03922: variable 'ansible_search_path' from source: unknown 30564 1726882916.03923: variable 'ansible_search_path' from source: unknown 30564 1726882916.04103: variable 'omit' from source: magic vars 30564 1726882916.04144: variable 'omit' from source: magic vars 30564 1726882916.04158: variable 'omit' from source: magic vars 30564 1726882916.04162: we have included files to process 30564 1726882916.04163: generating all_blocks data 30564 1726882916.04173: done generating all_blocks data 30564 1726882916.04182: processing included file: fedora.linux_system_roles.network 30564 1726882916.04203: in VariableManager get_vars() 30564 1726882916.04219: done with get_vars() 30564 1726882916.04246: in VariableManager get_vars() 30564 1726882916.04274: done with get_vars() 30564 1726882916.04323: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882916.04452: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882916.04542: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882916.05053: in VariableManager get_vars() 30564 1726882916.05083: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882916.07306: iterating over new_blocks loaded from include file 30564 1726882916.07308: in VariableManager get_vars() 30564 1726882916.07338: done with get_vars() 30564 1726882916.07340: filtering new block on tags 30564 1726882916.07678: done filtering new block on tags 30564 1726882916.07682: in VariableManager get_vars() 30564 1726882916.07700: done with 
get_vars() 30564 1726882916.07701: filtering new block on tags 30564 1726882916.07718: done filtering new block on tags 30564 1726882916.07721: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882916.07727: extending task lists for all hosts with included blocks 30564 1726882916.07855: done extending task lists 30564 1726882916.07856: done processing included files 30564 1726882916.07857: results queue empty 30564 1726882916.07858: checking for any_errors_fatal 30564 1726882916.07859: done checking for any_errors_fatal 30564 1726882916.07860: checking for max_fail_percentage 30564 1726882916.07861: done checking for max_fail_percentage 30564 1726882916.07867: checking to see if all hosts have failed and the running result is not ok 30564 1726882916.07870: done checking to see if all hosts have failed 30564 1726882916.07871: getting the remaining hosts for this loop 30564 1726882916.07873: done getting the remaining hosts for this loop 30564 1726882916.07881: getting the next task for host managed_node2 30564 1726882916.07886: done getting next task for host managed_node2 30564 1726882916.07888: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882916.07892: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882916.07904: getting variables 30564 1726882916.07905: in VariableManager get_vars() 30564 1726882916.07920: Calling all_inventory to load vars for managed_node2 30564 1726882916.07923: Calling groups_inventory to load vars for managed_node2 30564 1726882916.07925: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882916.07930: Calling all_plugins_play to load vars for managed_node2 30564 1726882916.07932: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882916.07935: Calling groups_plugins_play to load vars for managed_node2 30564 1726882916.09424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882916.11325: done with get_vars() 30564 1726882916.11360: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:41:56 -0400 (0:00:00.132) 0:01:54.695 ****** 30564 1726882916.11442: entering _queue_task() for managed_node2/include_tasks 30564 1726882916.11836: worker is 1 (out of 1 available) 30564 1726882916.11850: exiting _queue_task() for managed_node2/include_tasks 30564 1726882916.11863: done queuing things up, now waiting for results queue to drain 30564 1726882916.11897: 
waiting for pending results... 30564 1726882916.12234: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882916.12499: in run() - task 0e448fcc-3ce9-4216-acec-0000000024a4 30564 1726882916.12586: variable 'ansible_search_path' from source: unknown 30564 1726882916.12606: variable 'ansible_search_path' from source: unknown 30564 1726882916.12702: calling self._execute() 30564 1726882916.12898: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882916.12912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882916.12924: variable 'omit' from source: magic vars 30564 1726882916.13395: variable 'ansible_distribution_major_version' from source: facts 30564 1726882916.13409: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882916.13416: _execute() done 30564 1726882916.13419: dumping result to json 30564 1726882916.13427: done dumping result, returning 30564 1726882916.13434: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-0000000024a4] 30564 1726882916.13440: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a4 30564 1726882916.13552: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a4 30564 1726882916.13556: WORKER PROCESS EXITING 30564 1726882916.13619: no more pending results, returning what we have 30564 1726882916.13625: in VariableManager get_vars() 30564 1726882916.13696: Calling all_inventory to load vars for managed_node2 30564 1726882916.13700: Calling groups_inventory to load vars for managed_node2 30564 1726882916.13702: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882916.13716: Calling all_plugins_play to load vars for managed_node2 30564 1726882916.13720: Calling groups_plugins_inventory to load vars for managed_node2 30564 
1726882916.13723: Calling groups_plugins_play to load vars for managed_node2 30564 1726882916.16024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882916.17979: done with get_vars() 30564 1726882916.18006: variable 'ansible_search_path' from source: unknown 30564 1726882916.18008: variable 'ansible_search_path' from source: unknown 30564 1726882916.18056: we have included files to process 30564 1726882916.18058: generating all_blocks data 30564 1726882916.18060: done generating all_blocks data 30564 1726882916.18065: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882916.18066: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882916.18069: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882916.18682: done processing included file 30564 1726882916.18684: iterating over new_blocks loaded from include file 30564 1726882916.18686: in VariableManager get_vars() 30564 1726882916.18715: done with get_vars() 30564 1726882916.18716: filtering new block on tags 30564 1726882916.18747: done filtering new block on tags 30564 1726882916.18751: in VariableManager get_vars() 30564 1726882916.18778: done with get_vars() 30564 1726882916.18780: filtering new block on tags 30564 1726882916.19240: done filtering new block on tags 30564 1726882916.19242: in VariableManager get_vars() 30564 1726882916.19568: done with get_vars() 30564 1726882916.19570: filtering new block on tags 30564 1726882916.19615: done filtering new block on tags 30564 1726882916.19617: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882916.19623: 
extending task lists for all hosts with included blocks 30564 1726882916.21732: done extending task lists 30564 1726882916.21734: done processing included files 30564 1726882916.21735: results queue empty 30564 1726882916.21736: checking for any_errors_fatal 30564 1726882916.21739: done checking for any_errors_fatal 30564 1726882916.21739: checking for max_fail_percentage 30564 1726882916.21741: done checking for max_fail_percentage 30564 1726882916.21742: checking to see if all hosts have failed and the running result is not ok 30564 1726882916.21743: done checking to see if all hosts have failed 30564 1726882916.21743: getting the remaining hosts for this loop 30564 1726882916.21745: done getting the remaining hosts for this loop 30564 1726882916.21748: getting the next task for host managed_node2 30564 1726882916.21754: done getting next task for host managed_node2 30564 1726882916.21756: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882916.21761: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882916.21779: getting variables 30564 1726882916.21781: in VariableManager get_vars() 30564 1726882916.21798: Calling all_inventory to load vars for managed_node2 30564 1726882916.21800: Calling groups_inventory to load vars for managed_node2 30564 1726882916.21802: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882916.21807: Calling all_plugins_play to load vars for managed_node2 30564 1726882916.21810: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882916.21813: Calling groups_plugins_play to load vars for managed_node2 30564 1726882916.24425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882916.28826: done with get_vars() 30564 1726882916.28867: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:41:56 -0400 (0:00:00.175) 0:01:54.870 ****** 30564 1726882916.28961: entering _queue_task() for managed_node2/setup 30564 1726882916.29409: worker is 1 (out of 1 available) 30564 1726882916.29424: exiting _queue_task() for managed_node2/setup 30564 1726882916.29442: done queuing things up, now waiting for results queue to drain 30564 1726882916.29444: waiting for pending results... 
30564 1726882916.29800: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882916.29978: in run() - task 0e448fcc-3ce9-4216-acec-0000000024fb 30564 1726882916.30004: variable 'ansible_search_path' from source: unknown 30564 1726882916.30011: variable 'ansible_search_path' from source: unknown 30564 1726882916.30054: calling self._execute() 30564 1726882916.30170: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882916.30184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882916.30203: variable 'omit' from source: magic vars 30564 1726882916.30624: variable 'ansible_distribution_major_version' from source: facts 30564 1726882916.30649: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882916.31584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882916.34885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882916.34968: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882916.35008: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882916.35046: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882916.35085: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882916.35173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882916.35209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882916.35242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882916.35296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882916.35317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882916.35373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882916.35407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882916.35431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882916.35469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882916.35492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882916.35657: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882916.35675: variable 'ansible_facts' from source: unknown 30564 1726882916.37690: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882916.37699: when evaluation is False, skipping this task 30564 1726882916.37705: _execute() done 30564 1726882916.37711: dumping result to json 30564 1726882916.37718: done dumping result, returning 30564 1726882916.37729: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-0000000024fb] 30564 1726882916.37737: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024fb skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882916.37893: no more pending results, returning what we have 30564 1726882916.37898: results queue empty 30564 1726882916.37899: checking for any_errors_fatal 30564 1726882916.37901: done checking for any_errors_fatal 30564 1726882916.37902: checking for max_fail_percentage 30564 1726882916.37903: done checking for max_fail_percentage 30564 1726882916.37904: checking to see if all hosts have failed and the running result is not ok 30564 1726882916.37905: done checking to see if all hosts have failed 30564 1726882916.37906: getting the remaining hosts for this loop 30564 1726882916.37908: done getting the remaining hosts for this loop 30564 1726882916.37912: getting the next task for host managed_node2 30564 1726882916.37925: done getting next task for host managed_node2 30564 1726882916.37929: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882916.37935: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882916.37965: getting variables 30564 1726882916.37967: in VariableManager get_vars() 30564 1726882916.38021: Calling all_inventory to load vars for managed_node2 30564 1726882916.38024: Calling groups_inventory to load vars for managed_node2 30564 1726882916.38027: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882916.38039: Calling all_plugins_play to load vars for managed_node2 30564 1726882916.38042: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882916.38045: Calling groups_plugins_play to load vars for managed_node2 30564 1726882916.39285: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024fb 30564 1726882916.39295: WORKER PROCESS EXITING 30564 1726882916.40009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882916.42823: done with get_vars() 30564 1726882916.42853: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:41:56 -0400 (0:00:00.139) 0:01:55.010 ****** 30564 1726882916.42960: entering _queue_task() for managed_node2/stat 30564 1726882916.43329: worker is 1 (out of 1 available) 30564 1726882916.43344: exiting _queue_task() for managed_node2/stat 30564 1726882916.43356: done queuing things up, now waiting for results queue to drain 30564 1726882916.43357: waiting for pending results... 
30564 1726882916.43675: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882916.44634: in run() - task 0e448fcc-3ce9-4216-acec-0000000024fd 30564 1726882916.44669: variable 'ansible_search_path' from source: unknown 30564 1726882916.44681: variable 'ansible_search_path' from source: unknown 30564 1726882916.44812: calling self._execute() 30564 1726882916.44914: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882916.45010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882916.45035: variable 'omit' from source: magic vars 30564 1726882916.45657: variable 'ansible_distribution_major_version' from source: facts 30564 1726882916.45686: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882916.45907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882916.46226: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882916.46287: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882916.46337: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882916.46388: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882916.46488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882916.46519: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882916.46555: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882916.46594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882916.46693: variable '__network_is_ostree' from source: set_fact 30564 1726882916.46705: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882916.46713: when evaluation is False, skipping this task 30564 1726882916.46720: _execute() done 30564 1726882916.46727: dumping result to json 30564 1726882916.46734: done dumping result, returning 30564 1726882916.46750: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-0000000024fd] 30564 1726882916.46768: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024fd skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882916.46947: no more pending results, returning what we have 30564 1726882916.46952: results queue empty 30564 1726882916.46953: checking for any_errors_fatal 30564 1726882916.46965: done checking for any_errors_fatal 30564 1726882916.46966: checking for max_fail_percentage 30564 1726882916.46968: done checking for max_fail_percentage 30564 1726882916.46969: checking to see if all hosts have failed and the running result is not ok 30564 1726882916.46970: done checking to see if all hosts have failed 30564 1726882916.46971: getting the remaining hosts for this loop 30564 1726882916.46973: done getting the remaining hosts for this loop 30564 1726882916.46979: getting the next task for host managed_node2 30564 1726882916.46989: done getting next task for host managed_node2 30564 
1726882916.46993: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882916.46999: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882916.47033: getting variables 30564 1726882916.47035: in VariableManager get_vars() 30564 1726882916.47096: Calling all_inventory to load vars for managed_node2 30564 1726882916.47099: Calling groups_inventory to load vars for managed_node2 30564 1726882916.47102: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882916.47113: Calling all_plugins_play to load vars for managed_node2 30564 1726882916.47116: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882916.47119: Calling groups_plugins_play to load vars for managed_node2 30564 1726882916.48269: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024fd 30564 1726882916.48272: WORKER PROCESS EXITING 30564 1726882916.50168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882916.52054: done with get_vars() 30564 1726882916.52080: done getting variables 30564 1726882916.52140: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:41:56 -0400 (0:00:00.092) 0:01:55.103 ****** 30564 1726882916.52185: entering _queue_task() for managed_node2/set_fact 30564 1726882916.53478: worker is 1 (out of 1 available) 30564 1726882916.53491: exiting _queue_task() for managed_node2/set_fact 30564 1726882916.53504: done queuing things up, now waiting for results queue to drain 30564 1726882916.53505: waiting for pending results... 
30564 1726882916.53830: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882916.54003: in run() - task 0e448fcc-3ce9-4216-acec-0000000024fe 30564 1726882916.54026: variable 'ansible_search_path' from source: unknown 30564 1726882916.54035: variable 'ansible_search_path' from source: unknown 30564 1726882916.54087: calling self._execute() 30564 1726882916.54202: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882916.54213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882916.54237: variable 'omit' from source: magic vars 30564 1726882916.54653: variable 'ansible_distribution_major_version' from source: facts 30564 1726882916.54682: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882916.54884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882916.55192: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882916.55243: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882916.55291: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882916.55330: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882916.55428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882916.55459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882916.55504: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882916.55537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882916.55632: variable '__network_is_ostree' from source: set_fact 30564 1726882916.55644: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882916.55652: when evaluation is False, skipping this task 30564 1726882916.55659: _execute() done 30564 1726882916.55670: dumping result to json 30564 1726882916.55679: done dumping result, returning 30564 1726882916.55694: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-0000000024fe] 30564 1726882916.55709: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024fe skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882916.55860: no more pending results, returning what we have 30564 1726882916.55866: results queue empty 30564 1726882916.55867: checking for any_errors_fatal 30564 1726882916.55875: done checking for any_errors_fatal 30564 1726882916.55876: checking for max_fail_percentage 30564 1726882916.55878: done checking for max_fail_percentage 30564 1726882916.55879: checking to see if all hosts have failed and the running result is not ok 30564 1726882916.55880: done checking to see if all hosts have failed 30564 1726882916.55881: getting the remaining hosts for this loop 30564 1726882916.55883: done getting the remaining hosts for this loop 30564 1726882916.55887: getting the next task for host managed_node2 30564 1726882916.55900: done getting next task for host managed_node2 30564 
1726882916.55904: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882916.55911: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882916.55942: getting variables 30564 1726882916.55944: in VariableManager get_vars() 30564 1726882916.55998: Calling all_inventory to load vars for managed_node2 30564 1726882916.56000: Calling groups_inventory to load vars for managed_node2 30564 1726882916.56003: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882916.56014: Calling all_plugins_play to load vars for managed_node2 30564 1726882916.56018: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882916.56021: Calling groups_plugins_play to load vars for managed_node2 30564 1726882916.57996: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024fe 30564 1726882916.57999: WORKER PROCESS EXITING 30564 1726882916.58857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882916.61948: done with get_vars() 30564 1726882916.61978: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:41:56 -0400 (0:00:00.105) 0:01:55.208 ****** 30564 1726882916.62709: entering _queue_task() for managed_node2/service_facts 30564 1726882916.63030: worker is 1 (out of 1 available) 30564 1726882916.63042: exiting _queue_task() for managed_node2/service_facts 30564 1726882916.63056: done queuing things up, now waiting for results queue to drain 30564 1726882916.63057: waiting for pending results... 
30564 1726882916.63756: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882916.64053: in run() - task 0e448fcc-3ce9-4216-acec-000000002500 30564 1726882916.64068: variable 'ansible_search_path' from source: unknown 30564 1726882916.64072: variable 'ansible_search_path' from source: unknown 30564 1726882916.64109: calling self._execute() 30564 1726882916.64417: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882916.64424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882916.64526: variable 'omit' from source: magic vars 30564 1726882916.65074: variable 'ansible_distribution_major_version' from source: facts 30564 1726882916.65098: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882916.65107: variable 'omit' from source: magic vars 30564 1726882916.65192: variable 'omit' from source: magic vars 30564 1726882916.65232: variable 'omit' from source: magic vars 30564 1726882916.65277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882916.65320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882916.65343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882916.65358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882916.65375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882916.65406: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882916.65409: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882916.65411: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882916.65535: Set connection var ansible_timeout to 10 30564 1726882916.65538: Set connection var ansible_pipelining to False 30564 1726882916.65783: Set connection var ansible_shell_type to sh 30564 1726882916.65795: Set connection var ansible_shell_executable to /bin/sh 30564 1726882916.65807: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882916.65813: Set connection var ansible_connection to ssh 30564 1726882916.65842: variable 'ansible_shell_executable' from source: unknown 30564 1726882916.65851: variable 'ansible_connection' from source: unknown 30564 1726882916.65858: variable 'ansible_module_compression' from source: unknown 30564 1726882916.65871: variable 'ansible_shell_type' from source: unknown 30564 1726882916.65883: variable 'ansible_shell_executable' from source: unknown 30564 1726882916.65890: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882916.65897: variable 'ansible_pipelining' from source: unknown 30564 1726882916.65904: variable 'ansible_timeout' from source: unknown 30564 1726882916.65911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882916.66115: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882916.66132: variable 'omit' from source: magic vars 30564 1726882916.66140: starting attempt loop 30564 1726882916.66148: running the handler 30564 1726882916.66168: _low_level_execute_command(): starting 30564 1726882916.66181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882916.67282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882916.67299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882916.67315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.67335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.67388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.67402: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882916.67418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.67438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882916.67451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882916.67470: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882916.67486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882916.67502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.67519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.67535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.67548: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882916.67562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.67645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882916.67662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882916.67680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882916.67929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882916.69585: stdout chunk (state=3): >>>/root <<< 30564 1726882916.69772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882916.69775: stdout chunk (state=3): >>><<< 30564 1726882916.69785: stderr chunk (state=3): >>><<< 30564 1726882916.69908: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882916.69912: _low_level_execute_command(): starting 30564 1726882916.69915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751 `" && echo ansible-tmp-1726882916.698057-35549-56257748743751="` echo /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751 `" ) && sleep 0' 30564 1726882916.70991: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.70994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.71032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.71036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.71039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.71097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882916.71130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882916.71559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882916.73150: stdout chunk (state=3): >>>ansible-tmp-1726882916.698057-35549-56257748743751=/root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751 <<< 30564 1726882916.73323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882916.73327: stdout chunk (state=3): >>><<< 30564 1726882916.73334: stderr chunk (state=3): >>><<< 30564 1726882916.73358: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882916.698057-35549-56257748743751=/root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882916.73410: variable 'ansible_module_compression' from source: unknown 30564 1726882916.73457: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882916.73496: variable 'ansible_facts' from source: unknown 30564 1726882916.73574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751/AnsiballZ_service_facts.py 30564 1726882916.73716: Sending initial data 30564 1726882916.73719: Sent initial data (160 bytes) 30564 1726882916.75478: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882916.75482: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882916.75484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.75486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.75488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.75490: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882916.75492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.75494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882916.75497: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882916.75499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882916.75501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882916.75503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.75505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.75507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.75509: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882916.75511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.75513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882916.75515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882916.75517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882916.75782: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882916.77227: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882916.77316: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882916.77422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp0rmnhy37 /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751/AnsiballZ_service_facts.py <<< 30564 1726882916.77521: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882916.79070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882916.79073: stderr chunk (state=3): >>><<< 30564 1726882916.79076: stdout chunk (state=3): >>><<< 30564 1726882916.79178: done transferring module to remote 30564 1726882916.79181: _low_level_execute_command(): starting 30564 1726882916.79184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751/ /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751/AnsiballZ_service_facts.py && sleep 0' 30564 1726882916.80015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882916.80031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
30564 1726882916.80046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.80079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.80122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.80135: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882916.80150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.80175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882916.80189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882916.80201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882916.80214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882916.80295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.80313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.80326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.80338: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882916.80353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.80434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882916.80519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882916.80537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882916.80670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882916.82515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882916.82518: stdout chunk (state=3): >>><<< 30564 1726882916.82520: stderr chunk (state=3): >>><<< 30564 1726882916.82611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882916.82618: _low_level_execute_command(): starting 30564 1726882916.82620: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751/AnsiballZ_service_facts.py && sleep 0' 30564 1726882916.83219: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882916.83234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882916.83249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882916.83270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.83319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.83331: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882916.83345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.83362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882916.83378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882916.83398: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882916.83411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882916.83425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882916.83440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882916.83452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882916.83462: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882916.83480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882916.83562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882916.83586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882916.83603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882916.83746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882918.16319: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": 
"auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 30564 
1726882918.16339: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 30564 1726882918.16374: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": 
"snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": 
{"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 30564 1726882918.16378: stdout chunk (state=3): >>>e": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", 
"state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 30564 1726882918.16384: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 30564 1726882918.16386: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "system<<< 30564 1726882918.16390: stdout chunk (state=3): >>>d"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": 
"systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882918.17682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882918.17734: stderr chunk (state=3): >>><<< 30564 1726882918.17737: stdout chunk (state=3): >>><<< 30564 1726882918.17765: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": 
{"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": 
"nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": 
"systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": 
"systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": 
"systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882918.18769: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882918.18780: _low_level_execute_command(): starting 30564 1726882918.18785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882916.698057-35549-56257748743751/ > /dev/null 2>&1 && sleep 0' 30564 1726882918.19421: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882918.19430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.19441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.19455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.19500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.19508: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882918.19520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.19532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882918.19540: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
<<< 30564 1726882918.19547: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882918.19573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.19576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.19586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.19593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.19600: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882918.19609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.19682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882918.19696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882918.19702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882918.19833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882918.21645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882918.21725: stderr chunk (state=3): >>><<< 30564 1726882918.21741: stdout chunk (state=3): >>><<< 30564 1726882918.21771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882918.21874: handler run complete 30564 1726882918.21982: variable 'ansible_facts' from source: unknown 30564 1726882918.22145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882918.22616: variable 'ansible_facts' from source: unknown 30564 1726882918.22758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882918.22970: attempt loop complete, returning result 30564 1726882918.22982: _execute() done 30564 1726882918.22990: dumping result to json 30564 1726882918.23051: done dumping result, returning 30564 1726882918.23079: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-000000002500] 30564 1726882918.23089: sending task result for task 0e448fcc-3ce9-4216-acec-000000002500 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882918.24178: no more pending results, returning what we have 30564 1726882918.24182: results queue empty 30564 1726882918.24183: checking for any_errors_fatal 30564 1726882918.24187: done checking for any_errors_fatal 30564 1726882918.24188: checking for max_fail_percentage 30564 
1726882918.24190: done checking for max_fail_percentage 30564 1726882918.24191: checking to see if all hosts have failed and the running result is not ok 30564 1726882918.24192: done checking to see if all hosts have failed 30564 1726882918.24192: getting the remaining hosts for this loop 30564 1726882918.24195: done getting the remaining hosts for this loop 30564 1726882918.24199: getting the next task for host managed_node2 30564 1726882918.24207: done getting next task for host managed_node2 30564 1726882918.24211: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882918.24218: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882918.24231: getting variables 30564 1726882918.24233: in VariableManager get_vars() 30564 1726882918.24284: Calling all_inventory to load vars for managed_node2 30564 1726882918.24287: Calling groups_inventory to load vars for managed_node2 30564 1726882918.24290: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882918.24302: Calling all_plugins_play to load vars for managed_node2 30564 1726882918.24305: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882918.24308: Calling groups_plugins_play to load vars for managed_node2 30564 1726882918.25438: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002500 30564 1726882918.25442: WORKER PROCESS EXITING 30564 1726882918.26155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882918.28747: done with get_vars() 30564 1726882918.28775: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:41:58 -0400 (0:00:01.661) 0:01:56.870 ****** 30564 1726882918.28883: entering _queue_task() for managed_node2/package_facts 30564 1726882918.29221: worker is 1 (out of 1 available) 30564 1726882918.29233: exiting _queue_task() for managed_node2/package_facts 30564 1726882918.29246: done queuing things up, now waiting for results queue to drain 30564 1726882918.29247: waiting for pending results... 
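The task queued here runs `package_facts`, whose result (dumped later in this log) maps each package name to a list of records with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. As a hedged sketch (not from the log, and not the module's code), here is how such records can be rendered into rpm-style NEVRA strings, using two entries copied from the later output:

```python
# Illustrative subset of the package_facts mapping dumped later in this run.
# Field names mirror the logged JSON; nevra() is a hypothetical helper.
packages = {
    "libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9",
                "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5",
                          "release": "7.el9.1", "epoch": 1, "arch": "noarch",
                          "source": "rpm"}],
}

def nevra(rec):
    """Format one package record; the epoch prefix is omitted when null, as rpm does."""
    epoch = f"{rec['epoch']}:" if rec["epoch"] is not None else ""
    return f"{rec['name']}-{epoch}{rec['version']}-{rec['release']}.{rec['arch']}"

print(sorted(nevra(r) for recs in packages.values() for r in recs))
```

Each package name maps to a *list* because multiple versions or architectures of the same package can be installed simultaneously.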
30564 1726882918.29560: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882918.29759: in run() - task 0e448fcc-3ce9-4216-acec-000000002501 30564 1726882918.29785: variable 'ansible_search_path' from source: unknown 30564 1726882918.29799: variable 'ansible_search_path' from source: unknown 30564 1726882918.29842: calling self._execute() 30564 1726882918.29966: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882918.29981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882918.29997: variable 'omit' from source: magic vars 30564 1726882918.30396: variable 'ansible_distribution_major_version' from source: facts 30564 1726882918.30415: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882918.30425: variable 'omit' from source: magic vars 30564 1726882918.30513: variable 'omit' from source: magic vars 30564 1726882918.30547: variable 'omit' from source: magic vars 30564 1726882918.30602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882918.30642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882918.30677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882918.30702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882918.30723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882918.30758: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882918.30772: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882918.30785: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882918.30914: Set connection var ansible_timeout to 10 30564 1726882918.30931: Set connection var ansible_pipelining to False 30564 1726882918.30937: Set connection var ansible_shell_type to sh 30564 1726882918.30948: Set connection var ansible_shell_executable to /bin/sh 30564 1726882918.30961: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882918.30973: Set connection var ansible_connection to ssh 30564 1726882918.31007: variable 'ansible_shell_executable' from source: unknown 30564 1726882918.31014: variable 'ansible_connection' from source: unknown 30564 1726882918.31021: variable 'ansible_module_compression' from source: unknown 30564 1726882918.31027: variable 'ansible_shell_type' from source: unknown 30564 1726882918.31038: variable 'ansible_shell_executable' from source: unknown 30564 1726882918.31044: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882918.31051: variable 'ansible_pipelining' from source: unknown 30564 1726882918.31057: variable 'ansible_timeout' from source: unknown 30564 1726882918.31066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882918.31280: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882918.31295: variable 'omit' from source: magic vars 30564 1726882918.31303: starting attempt loop 30564 1726882918.31309: running the handler 30564 1726882918.31328: _low_level_execute_command(): starting 30564 1726882918.31338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882918.32139: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882918.32152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882918.32173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.32191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.32240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.32251: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882918.32262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.32285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882918.32295: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882918.32305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882918.32317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.32332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.32348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.32358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.32373: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882918.32386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.32460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882918.32485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882918.32499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882918.32628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882918.34298: stdout chunk (state=3): >>>/root <<< 30564 1726882918.34408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882918.34489: stderr chunk (state=3): >>><<< 30564 1726882918.34503: stdout chunk (state=3): >>><<< 30564 1726882918.34629: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882918.34632: _low_level_execute_command(): starting 30564 1726882918.34635: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970 `" && echo ansible-tmp-1726882918.3453424-35707-91527913060970="` echo /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970 `" ) && sleep 0' 30564 1726882918.35246: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882918.35259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.35283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.35302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.35349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.35363: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882918.35383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.35399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882918.35410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882918.35426: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882918.35438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.35451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.35470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.35484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.35494: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882918.35507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.35591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882918.35612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882918.35627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882918.35772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882918.37628: stdout chunk (state=3): >>>ansible-tmp-1726882918.3453424-35707-91527913060970=/root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970 <<< 30564 1726882918.37750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882918.37825: stderr chunk (state=3): >>><<< 30564 1726882918.37835: stdout chunk (state=3): >>><<< 30564 1726882918.38174: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882918.3453424-35707-91527913060970=/root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882918.38178: variable 'ansible_module_compression' from source: unknown 30564 1726882918.38181: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882918.38183: variable 'ansible_facts' from source: unknown 30564 1726882918.38237: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970/AnsiballZ_package_facts.py 30564 1726882918.38412: Sending initial data 30564 1726882918.38415: Sent initial data (161 bytes) 30564 1726882918.41049: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882918.41058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.41073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.41085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.41123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.41130: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882918.41141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.41153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882918.41160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882918.41173: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882918.41181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.41191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.41202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.41211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882918.41217: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882918.41227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.41299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882918.41312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882918.41322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882918.41451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882918.43243: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882918.43339: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882918.43442: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp3ojipcjg /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970/AnsiballZ_package_facts.py <<< 30564 1726882918.43535: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882918.46233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882918.46387: stderr chunk (state=3): >>><<< 30564 1726882918.46391: stdout chunk (state=3): >>><<< 30564 
1726882918.46410: done transferring module to remote 30564 1726882918.46422: _low_level_execute_command(): starting 30564 1726882918.46427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970/ /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970/AnsiballZ_package_facts.py && sleep 0' 30564 1726882918.47060: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882918.47074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.47086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.47100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.47138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.47146: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882918.47153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.47169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882918.47181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882918.47188: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882918.47196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.47206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.47218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.47225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.47232: stderr 
chunk (state=3): >>>debug2: match found <<< 30564 1726882918.47242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.47317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882918.47332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882918.47342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882918.47468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882918.49305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882918.49379: stderr chunk (state=3): >>><<< 30564 1726882918.49383: stdout chunk (state=3): >>><<< 30564 1726882918.49399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 30564 1726882918.49403: _low_level_execute_command(): starting 30564 1726882918.49408: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970/AnsiballZ_package_facts.py && sleep 0' 30564 1726882918.50040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882918.50049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.50059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.50081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.50121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.50127: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882918.50137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.50150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882918.50157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882918.50165: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882918.50176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882918.50186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882918.50197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882918.50204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882918.50212: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882918.50219: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882918.50295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882918.50308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882918.50319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882918.50461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882918.96699: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", 
"release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 30564 1726882918.96758: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": 
"libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 30564 1726882918.96762: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": 
[{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 30564 1726882918.96820: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", 
"version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 30564 1726882918.96825: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 30564 1726882918.96852: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", 
"release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 30564 1726882918.96857: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": 
"lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 30564 1726882918.96862: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": 
"4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 30564 
1726882918.96886: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": 
"kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 30564 1726882918.96892: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": 
"rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 30564 1726882918.96898: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 30564 1726882918.96903: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": 
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 30564 1726882918.96905: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 30564 1726882918.96909: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": 
"5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 30564 1726882918.96911: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": 
"nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 30564 1726882918.96914: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], 
"dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882918.98380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882918.98462: stderr chunk (state=3): >>><<< 30564 1726882918.98467: stdout chunk (state=3): >>><<< 30564 1726882918.98979: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882919.01256: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882919.01288: _low_level_execute_command(): starting 30564 1726882919.01298: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882918.3453424-35707-91527913060970/ > /dev/null 2>&1 && sleep 0' 30564 1726882919.01979: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882919.01999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882919.02016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882919.02035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882919.02084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882919.02098: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882919.02112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882919.02129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882919.02140: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882919.02153: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882919.02170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882919.02185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882919.02204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882919.02216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882919.02226: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882919.02238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882919.02325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882919.02342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882919.02356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882919.02499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882919.04404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882919.04408: stdout chunk (state=3): >>><<< 30564 1726882919.04410: stderr chunk (state=3): >>><<< 30564 1726882919.04869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882919.04873: handler run complete 30564 1726882919.15423: variable 'ansible_facts' from source: unknown 30564 1726882919.16096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882919.19387: variable 'ansible_facts' from source: unknown 30564 1726882919.19914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882919.21286: attempt loop complete, returning result 30564 1726882919.21311: _execute() done 30564 1726882919.21314: dumping result to json 30564 1726882919.21761: done dumping result, returning 30564 1726882919.21777: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-000000002501] 30564 1726882919.21784: sending task result for task 0e448fcc-3ce9-4216-acec-000000002501 30564 1726882919.33931: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002501 30564 1726882919.33935: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882919.34025: no more pending results, returning what we have 30564 1726882919.34028: results queue empty 30564 1726882919.34029: checking for 
any_errors_fatal 30564 1726882919.34033: done checking for any_errors_fatal 30564 1726882919.34034: checking for max_fail_percentage 30564 1726882919.34035: done checking for max_fail_percentage 30564 1726882919.34036: checking to see if all hosts have failed and the running result is not ok 30564 1726882919.34037: done checking to see if all hosts have failed 30564 1726882919.34038: getting the remaining hosts for this loop 30564 1726882919.34039: done getting the remaining hosts for this loop 30564 1726882919.34046: getting the next task for host managed_node2 30564 1726882919.34056: done getting next task for host managed_node2 30564 1726882919.34060: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882919.34071: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882919.34084: getting variables 30564 1726882919.34085: in VariableManager get_vars() 30564 1726882919.34114: Calling all_inventory to load vars for managed_node2 30564 1726882919.34117: Calling groups_inventory to load vars for managed_node2 30564 1726882919.34129: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882919.34138: Calling all_plugins_play to load vars for managed_node2 30564 1726882919.34141: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882919.34144: Calling groups_plugins_play to load vars for managed_node2 30564 1726882919.35594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882919.37437: done with get_vars() 30564 1726882919.37471: done getting variables 30564 1726882919.37530: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:41:59 -0400 (0:00:01.086) 0:01:57.956 ****** 30564 1726882919.37560: entering _queue_task() for managed_node2/debug 30564 1726882919.37967: worker is 1 (out of 1 available) 30564 1726882919.37983: exiting _queue_task() for managed_node2/debug 30564 1726882919.37996: done queuing things up, now waiting for results queue to drain 30564 1726882919.37997: waiting for pending results... 
30564 1726882919.38364: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882919.38519: in run() - task 0e448fcc-3ce9-4216-acec-0000000024a5 30564 1726882919.38531: variable 'ansible_search_path' from source: unknown 30564 1726882919.38535: variable 'ansible_search_path' from source: unknown 30564 1726882919.38581: calling self._execute() 30564 1726882919.38693: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882919.38699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882919.38709: variable 'omit' from source: magic vars 30564 1726882919.39147: variable 'ansible_distribution_major_version' from source: facts 30564 1726882919.39159: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882919.39168: variable 'omit' from source: magic vars 30564 1726882919.39253: variable 'omit' from source: magic vars 30564 1726882919.39359: variable 'network_provider' from source: set_fact 30564 1726882919.39392: variable 'omit' from source: magic vars 30564 1726882919.39462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882919.39499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882919.39531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882919.39556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882919.39579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882919.39600: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882919.39604: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882919.39607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882919.39719: Set connection var ansible_timeout to 10 30564 1726882919.39722: Set connection var ansible_pipelining to False 30564 1726882919.39725: Set connection var ansible_shell_type to sh 30564 1726882919.39731: Set connection var ansible_shell_executable to /bin/sh 30564 1726882919.39739: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882919.39743: Set connection var ansible_connection to ssh 30564 1726882919.39771: variable 'ansible_shell_executable' from source: unknown 30564 1726882919.39776: variable 'ansible_connection' from source: unknown 30564 1726882919.39779: variable 'ansible_module_compression' from source: unknown 30564 1726882919.39781: variable 'ansible_shell_type' from source: unknown 30564 1726882919.39785: variable 'ansible_shell_executable' from source: unknown 30564 1726882919.39787: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882919.39790: variable 'ansible_pipelining' from source: unknown 30564 1726882919.39792: variable 'ansible_timeout' from source: unknown 30564 1726882919.39797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882919.39952: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882919.39967: variable 'omit' from source: magic vars 30564 1726882919.39999: starting attempt loop 30564 1726882919.40003: running the handler 30564 1726882919.40036: handler run complete 30564 1726882919.40058: attempt loop complete, returning result 30564 1726882919.40069: _execute() done 30564 1726882919.40072: dumping result to json 30564 1726882919.40077: done dumping result, returning 
30564 1726882919.40087: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-0000000024a5] 30564 1726882919.40098: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a5 ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882919.40290: no more pending results, returning what we have 30564 1726882919.40294: results queue empty 30564 1726882919.40295: checking for any_errors_fatal 30564 1726882919.40309: done checking for any_errors_fatal 30564 1726882919.40310: checking for max_fail_percentage 30564 1726882919.40312: done checking for max_fail_percentage 30564 1726882919.40313: checking to see if all hosts have failed and the running result is not ok 30564 1726882919.40314: done checking to see if all hosts have failed 30564 1726882919.40316: getting the remaining hosts for this loop 30564 1726882919.40318: done getting the remaining hosts for this loop 30564 1726882919.40322: getting the next task for host managed_node2 30564 1726882919.40331: done getting next task for host managed_node2 30564 1726882919.40335: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882919.40342: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882919.40356: getting variables 30564 1726882919.40358: in VariableManager get_vars() 30564 1726882919.40421: Calling all_inventory to load vars for managed_node2 30564 1726882919.40423: Calling groups_inventory to load vars for managed_node2 30564 1726882919.40427: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882919.40438: Calling all_plugins_play to load vars for managed_node2 30564 1726882919.40441: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882919.40445: Calling groups_plugins_play to load vars for managed_node2 30564 1726882919.41386: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a5 30564 1726882919.41390: WORKER PROCESS EXITING 30564 1726882919.42397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882919.44329: done with get_vars() 30564 1726882919.44362: done getting variables 30564 1726882919.44430: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:41:59 -0400 (0:00:00.069) 0:01:58.026 ****** 30564 1726882919.44484: entering _queue_task() for managed_node2/fail 30564 1726882919.44831: worker is 1 (out of 1 available) 30564 1726882919.44848: exiting _queue_task() for managed_node2/fail 30564 1726882919.44860: done queuing things up, now waiting for results queue to drain 30564 1726882919.44862: waiting for pending results... 30564 1726882919.45197: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882919.45369: in run() - task 0e448fcc-3ce9-4216-acec-0000000024a6 30564 1726882919.45391: variable 'ansible_search_path' from source: unknown 30564 1726882919.45400: variable 'ansible_search_path' from source: unknown 30564 1726882919.45451: calling self._execute() 30564 1726882919.45578: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882919.45591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882919.45606: variable 'omit' from source: magic vars 30564 1726882919.46004: variable 'ansible_distribution_major_version' from source: facts 30564 1726882919.46023: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882919.46156: variable 'network_state' from source: role '' defaults 30564 1726882919.46178: Evaluated conditional (network_state != {}): False 30564 1726882919.46190: when evaluation is False, skipping this task 30564 1726882919.46198: _execute() done 30564 1726882919.46205: dumping result to json 30564 1726882919.46212: done dumping result, returning 30564 1726882919.46222: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-0000000024a6]
30564 1726882919.46232: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a6
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882919.46407: no more pending results, returning what we have
30564 1726882919.46412: results queue empty
30564 1726882919.46414: checking for any_errors_fatal
30564 1726882919.46426: done checking for any_errors_fatal
30564 1726882919.46427: checking for max_fail_percentage
30564 1726882919.46430: done checking for max_fail_percentage
30564 1726882919.46434: checking to see if all hosts have failed and the running result is not ok
30564 1726882919.46435: done checking to see if all hosts have failed
30564 1726882919.46436: getting the remaining hosts for this loop
30564 1726882919.46438: done getting the remaining hosts for this loop
30564 1726882919.46447: getting the next task for host managed_node2
30564 1726882919.46462: done getting next task for host managed_node2
30564 1726882919.46471: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30564 1726882919.46478: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882919.46514: getting variables
30564 1726882919.46516: in VariableManager get_vars()
30564 1726882919.46565: Calling all_inventory to load vars for managed_node2
30564 1726882919.46571: Calling groups_inventory to load vars for managed_node2
30564 1726882919.46574: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882919.46586: Calling all_plugins_play to load vars for managed_node2
30564 1726882919.46590: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882919.46593: Calling groups_plugins_play to load vars for managed_node2
30564 1726882919.47576: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a6
30564 1726882919.47580: WORKER PROCESS EXITING
30564 1726882919.48559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882919.50380: done with get_vars()
30564 1726882919.50404: done getting variables
30564 1726882919.50463: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:41:59 -0400 (0:00:00.060) 0:01:58.086 ******
30564 1726882919.50503: entering _queue_task() for managed_node2/fail
30564 1726882919.50819: worker is 1 (out of 1 available)
30564 1726882919.50832: exiting _queue_task() for managed_node2/fail
30564 1726882919.50844: done queuing things up, now waiting for results queue to drain
30564 1726882919.50846: waiting for pending results...
30564 1726882919.51145: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
30564 1726882919.51302: in run() - task 0e448fcc-3ce9-4216-acec-0000000024a7
30564 1726882919.51322: variable 'ansible_search_path' from source: unknown
30564 1726882919.51330: variable 'ansible_search_path' from source: unknown
30564 1726882919.51373: calling self._execute()
30564 1726882919.51488: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882919.51506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882919.51523: variable 'omit' from source: magic vars
30564 1726882919.51912: variable 'ansible_distribution_major_version' from source: facts
30564 1726882919.51929: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882919.52073: variable 'network_state' from source: role '' defaults
30564 1726882919.52088: Evaluated conditional (network_state != {}): False
30564 1726882919.52096: when evaluation is False, skipping this task
30564 1726882919.52103: _execute() done
30564 1726882919.52110: dumping result to json
30564 1726882919.52117: done dumping result, returning
30564 1726882919.52128: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-0000000024a7]
30564 1726882919.52138: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a7
30564 1726882919.52253: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a7
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882919.52308: no more pending results, returning what we have
30564 1726882919.52313: results queue empty
30564 1726882919.52314: checking for any_errors_fatal
30564 1726882919.52324: done checking for any_errors_fatal
30564 1726882919.52325: checking for max_fail_percentage
30564 1726882919.52327: done checking for max_fail_percentage
30564 1726882919.52329: checking to see if all hosts have failed and the running result is not ok
30564 1726882919.52329: done checking to see if all hosts have failed
30564 1726882919.52330: getting the remaining hosts for this loop
30564 1726882919.52332: done getting the remaining hosts for this loop
30564 1726882919.52336: getting the next task for host managed_node2
30564 1726882919.52346: done getting next task for host managed_node2
30564 1726882919.52350: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30564 1726882919.52357: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882919.52394: getting variables
30564 1726882919.52396: in VariableManager get_vars()
30564 1726882919.52442: Calling all_inventory to load vars for managed_node2
30564 1726882919.52445: Calling groups_inventory to load vars for managed_node2
30564 1726882919.52448: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882919.52460: Calling all_plugins_play to load vars for managed_node2
30564 1726882919.52465: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882919.52470: Calling groups_plugins_play to load vars for managed_node2
30564 1726882919.53685: WORKER PROCESS EXITING
30564 1726882919.54287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882919.56063: done with get_vars()
30564 1726882919.56090: done getting variables
30564 1726882919.56146: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:41:59 -0400 (0:00:00.056) 0:01:58.143 ******
30564 1726882919.56185: entering _queue_task() for managed_node2/fail
30564 1726882919.56472: worker is 1 (out of 1 available)
30564 1726882919.56484: exiting _queue_task() for managed_node2/fail
30564 1726882919.56495: done queuing things up, now waiting for results queue to drain
30564 1726882919.56496: waiting for pending results...
30564 1726882919.56788: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
30564 1726882919.56937: in run() - task 0e448fcc-3ce9-4216-acec-0000000024a8
30564 1726882919.56956: variable 'ansible_search_path' from source: unknown
30564 1726882919.56966: variable 'ansible_search_path' from source: unknown
30564 1726882919.57011: calling self._execute()
30564 1726882919.57126: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882919.57137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882919.57157: variable 'omit' from source: magic vars
30564 1726882919.57548: variable 'ansible_distribution_major_version' from source: facts
30564 1726882919.57571: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882919.57752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882919.60549: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882919.60631: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882919.60684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882919.60725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882919.60761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882919.60854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882919.60899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882919.60932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882919.60989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882919.61009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882919.61121: variable 'ansible_distribution_major_version' from source: facts
30564 1726882919.61142: Evaluated conditional (ansible_distribution_major_version | int > 9): False
30564 1726882919.61151: when evaluation is False, skipping this task
30564 1726882919.61158: _execute() done
30564 1726882919.61166: dumping result to json
30564 1726882919.61177: done dumping result, returning
30564 1726882919.61194: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-0000000024a8]
30564 1726882919.61204: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a8
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
30564 1726882919.61376: no more pending results, returning what we have
30564 1726882919.61381: results queue empty
30564 1726882919.61382: checking for any_errors_fatal
30564 1726882919.61389: done checking for any_errors_fatal
30564 1726882919.61390: checking for max_fail_percentage
30564 1726882919.61393: done checking for max_fail_percentage
30564 1726882919.61394: checking to see if all hosts have failed and the running result is not ok
30564 1726882919.61395: done checking to see if all hosts have failed
30564 1726882919.61396: getting the remaining hosts for this loop
30564 1726882919.61398: done getting the remaining hosts for this loop
30564 1726882919.61402: getting the next task for host managed_node2
30564 1726882919.61413: done getting next task for host managed_node2
30564 1726882919.61418: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882919.61425: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882919.61460: getting variables
30564 1726882919.61462: in VariableManager get_vars()
30564 1726882919.61520: Calling all_inventory to load vars for managed_node2
30564 1726882919.61523: Calling groups_inventory to load vars for managed_node2
30564 1726882919.61526: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882919.61538: Calling all_plugins_play to load vars for managed_node2
30564 1726882919.61542: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882919.61545: Calling groups_plugins_play to load vars for managed_node2
30564 1726882919.62485: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a8
30564 1726882919.62488: WORKER PROCESS EXITING
30564 1726882919.63505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882919.65249: done with get_vars()
30564 1726882919.65278: done getting variables
30564 1726882919.65337: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:41:59 -0400 (0:00:00.091) 0:01:58.235 ******
30564 1726882919.65377: entering _queue_task() for managed_node2/dnf
30564 1726882919.65719: worker is 1 (out of 1 available)
30564 1726882919.65732: exiting _queue_task() for managed_node2/dnf
30564 1726882919.65745: done queuing things up, now waiting for results queue to drain
30564 1726882919.65746: waiting for pending results...
30564 1726882919.66047: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
30564 1726882919.66209: in run() - task 0e448fcc-3ce9-4216-acec-0000000024a9
30564 1726882919.66229: variable 'ansible_search_path' from source: unknown
30564 1726882919.66239: variable 'ansible_search_path' from source: unknown
30564 1726882919.66290: calling self._execute()
30564 1726882919.66387: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882919.66391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882919.66406: variable 'omit' from source: magic vars
30564 1726882919.66698: variable 'ansible_distribution_major_version' from source: facts
30564 1726882919.66710: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882919.66853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882919.68645: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882919.68723: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882919.68766: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882919.68807: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882919.68836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882919.68918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882919.68971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882919.69003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882919.69051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882919.69075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882919.69198: variable 'ansible_distribution' from source: facts
30564 1726882919.69202: variable 'ansible_distribution_major_version' from source: facts
30564 1726882919.69213: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
30564 1726882919.69310: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882919.69401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882919.69425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882919.69450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882919.69480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882919.69492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882919.69518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882919.69540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882919.69560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882919.69590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882919.69600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882919.69626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882919.69645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882919.69669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882919.69693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882919.69703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882919.69816: variable 'network_connections' from source: include params
30564 1726882919.69824: variable 'interface' from source: play vars
30564 1726882919.69875: variable 'interface' from source: play vars
30564 1726882919.69924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882919.70035: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882919.70061: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882919.70093: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882919.70111: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882919.70140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882919.70156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882919.70181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882919.70204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882919.70240: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882919.70546: variable 'network_connections' from source: include params
30564 1726882919.70549: variable 'interface' from source: play vars
30564 1726882919.70596: variable 'interface' from source: play vars
30564 1726882919.70614: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882919.70619: when evaluation is False, skipping this task
30564 1726882919.70622: _execute() done
30564 1726882919.70625: dumping result to json
30564 1726882919.70627: done dumping result, returning
30564 1726882919.70631: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000024a9]
30564 1726882919.70636: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a9
30564 1726882919.70731: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024a9
30564 1726882919.70734: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882919.70789: no more pending results, returning what we have
30564 1726882919.70793: results queue empty
30564 1726882919.70794: checking for any_errors_fatal
30564 1726882919.70801: done checking for any_errors_fatal
30564 1726882919.70802: checking for max_fail_percentage
30564 1726882919.70804: done checking for max_fail_percentage
30564 1726882919.70805: checking to see if all hosts have failed and the running result is not ok
30564 1726882919.70805: done checking to see if all hosts have failed
30564 1726882919.70806: getting the remaining hosts for this loop
30564 1726882919.70808: done getting the remaining hosts for this loop
30564 1726882919.70811: getting the next task for host managed_node2
30564 1726882919.70820: done getting next task for host managed_node2
30564 1726882919.70824: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882919.70829: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882919.70858: getting variables
30564 1726882919.70859: in VariableManager get_vars()
30564 1726882919.70903: Calling all_inventory to load vars for managed_node2
30564 1726882919.70905: Calling groups_inventory to load vars for managed_node2
30564 1726882919.70907: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882919.70917: Calling all_plugins_play to load vars for managed_node2
30564 1726882919.70919: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882919.70922: Calling groups_plugins_play to load vars for managed_node2
30564 1726882919.72316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882919.74073: done with get_vars()
30564 1726882919.74099: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
30564 1726882919.74175: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:41:59 -0400 (0:00:00.088) 0:01:58.323 ******
30564 1726882919.74209: entering _queue_task() for managed_node2/yum
30564 1726882919.74649: worker is 1 (out of 1 available)
30564 1726882919.74661: exiting _queue_task() for managed_node2/yum
30564 1726882919.74674: done queuing things up, now waiting for results queue to drain
30564 1726882919.74676: waiting for pending results...
30564 1726882919.75219: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
30564 1726882919.75380: in run() - task 0e448fcc-3ce9-4216-acec-0000000024aa
30564 1726882919.75537: variable 'ansible_search_path' from source: unknown
30564 1726882919.75572: variable 'ansible_search_path' from source: unknown
30564 1726882919.76183: calling self._execute()
30564 1726882919.76312: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882919.76435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882919.76476: variable 'omit' from source: magic vars
30564 1726882919.76887: variable 'ansible_distribution_major_version' from source: facts
30564 1726882919.76921: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882919.77154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882919.80346: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882919.80421: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882919.80466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882919.80511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882919.80541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882919.80629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882919.80682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882919.80717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882919.80765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882919.80786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882919.80888: variable 'ansible_distribution_major_version' from source: facts
30564 1726882919.80909: Evaluated conditional (ansible_distribution_major_version | int < 8): False
30564 1726882919.80916: when evaluation is False, skipping this task
30564 1726882919.80923: _execute() done
30564 1726882919.80939: dumping result to json
30564 1726882919.80947: done dumping result, returning
30564 1726882919.80959: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000024aa]
30564 1726882919.80974: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024aa
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
30564 1726882919.81136: no more pending results, returning what we have
30564 1726882919.81140: results queue empty
30564 1726882919.81142: checking for any_errors_fatal
30564 1726882919.81153: done checking for any_errors_fatal
30564 1726882919.81154: checking for max_fail_percentage
30564 1726882919.81156: done checking for max_fail_percentage
30564 1726882919.81157: checking to see if all hosts have failed and the running result is not ok
30564 1726882919.81158: done checking to see if all hosts have failed
30564 1726882919.81159: getting the remaining hosts for this loop
30564 1726882919.81161: done getting the remaining hosts for this loop
30564 1726882919.81167: getting the next task for host managed_node2
30564 1726882919.81178: done getting next task for host managed_node2
30564 1726882919.81183: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
30564 1726882919.81188: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882919.81222: getting variables
30564 1726882919.81224: in VariableManager get_vars()
30564 1726882919.81274: Calling all_inventory to load vars for managed_node2
30564 1726882919.81277: Calling groups_inventory to load vars for managed_node2
30564 1726882919.81280: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882919.81291: Calling all_plugins_play to load vars for managed_node2
30564 1726882919.81294: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882919.81296: Calling groups_plugins_play to load vars for managed_node2
30564 1726882919.82258: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024aa
30564 1726882919.82262: WORKER PROCESS EXITING
30564 1726882919.83278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882919.85101: done with get_vars()
30564 1726882919.85127: done getting variables
30564 1726882919.85190: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:41:59 -0400 (0:00:00.110) 0:01:58.433 ****** 30564 1726882919.85225: entering _queue_task() for managed_node2/fail 30564 1726882919.85579: worker is 1 (out of 1 available) 30564 1726882919.85592: exiting _queue_task() for managed_node2/fail 30564 1726882919.85604: done queuing things up, now waiting for results queue to drain 30564 1726882919.85605: waiting for pending results... 30564 1726882919.86238: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882919.86404: in run() - task 0e448fcc-3ce9-4216-acec-0000000024ab 30564 1726882919.86429: variable 'ansible_search_path' from source: unknown 30564 1726882919.86437: variable 'ansible_search_path' from source: unknown 30564 1726882919.86482: calling self._execute() 30564 1726882919.86598: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882919.86609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882919.86627: variable 'omit' from source: magic vars 30564 1726882919.87011: variable 'ansible_distribution_major_version' from source: facts 30564 1726882919.87030: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882919.87151: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882919.87346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882919.89953: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882919.90027: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882919.90069: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882919.90113: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882919.90144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882919.90228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882919.90261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882919.90295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882919.90350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882919.90375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882919.90423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882919.90455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882919.90486: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882919.90531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882919.90554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882919.90599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882919.90626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882919.90659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882919.90707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882919.90726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882919.90910: variable 'network_connections' from source: include params 30564 1726882919.90926: variable 'interface' from source: play vars 30564 1726882919.91000: variable 'interface' from source: play vars 30564 1726882919.91076: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882919.91251: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882919.91295: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882919.91329: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882919.91358: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882919.91405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882919.91434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882919.91465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882919.91496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882919.91550: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882919.91798: variable 'network_connections' from source: include params 30564 1726882919.91808: variable 'interface' from source: play vars 30564 1726882919.91878: variable 'interface' from source: play vars 30564 1726882919.91906: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882919.91914: when evaluation is False, skipping this task 30564 
1726882919.91921: _execute() done 30564 1726882919.91928: dumping result to json 30564 1726882919.91935: done dumping result, returning 30564 1726882919.91945: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000024ab] 30564 1726882919.91958: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ab skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882919.92124: no more pending results, returning what we have 30564 1726882919.92128: results queue empty 30564 1726882919.92130: checking for any_errors_fatal 30564 1726882919.92136: done checking for any_errors_fatal 30564 1726882919.92137: checking for max_fail_percentage 30564 1726882919.92139: done checking for max_fail_percentage 30564 1726882919.92140: checking to see if all hosts have failed and the running result is not ok 30564 1726882919.92141: done checking to see if all hosts have failed 30564 1726882919.92142: getting the remaining hosts for this loop 30564 1726882919.92144: done getting the remaining hosts for this loop 30564 1726882919.92148: getting the next task for host managed_node2 30564 1726882919.92157: done getting next task for host managed_node2 30564 1726882919.92162: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882919.92169: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882919.92199: getting variables 30564 1726882919.92202: in VariableManager get_vars() 30564 1726882919.92251: Calling all_inventory to load vars for managed_node2 30564 1726882919.92253: Calling groups_inventory to load vars for managed_node2 30564 1726882919.92256: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882919.92268: Calling all_plugins_play to load vars for managed_node2 30564 1726882919.92273: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882919.92277: Calling groups_plugins_play to load vars for managed_node2 30564 1726882919.93521: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ab 30564 1726882919.93524: WORKER PROCESS EXITING 30564 1726882919.94156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882919.95930: done with get_vars() 30564 1726882919.96636: done getting variables 30564 1726882919.96772: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:41:59 -0400 (0:00:00.115) 0:01:58.549 ****** 30564 1726882919.96809: entering _queue_task() for managed_node2/package 30564 1726882919.97199: worker is 1 (out of 1 available) 30564 1726882919.97211: exiting _queue_task() for managed_node2/package 30564 1726882919.97222: done queuing things up, now waiting for results queue to drain 30564 1726882919.97223: waiting for pending results... 30564 1726882919.97502: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882919.97652: in run() - task 0e448fcc-3ce9-4216-acec-0000000024ac 30564 1726882919.97677: variable 'ansible_search_path' from source: unknown 30564 1726882919.97685: variable 'ansible_search_path' from source: unknown 30564 1726882919.97723: calling self._execute() 30564 1726882919.97835: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882919.97846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882919.97861: variable 'omit' from source: magic vars 30564 1726882919.98236: variable 'ansible_distribution_major_version' from source: facts 30564 1726882919.98253: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882919.98461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882919.98807: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882919.98859: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882919.99123: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882919.99191: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882919.99310: variable 'network_packages' from source: role '' defaults 30564 1726882919.99425: variable '__network_provider_setup' from source: role '' defaults 30564 1726882919.99444: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882919.99531: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882919.99545: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882919.99617: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882919.99817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882920.03355: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882920.03431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882920.03620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882920.03658: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882920.03694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882920.03891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882920.03922: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882920.04071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882920.04117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882920.04137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882920.04192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882920.04295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882920.04325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882920.04373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882920.04392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882920.04739: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882920.04959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882920.05050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882920.05083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882920.05266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882920.05289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882920.05499: variable 'ansible_python' from source: facts 30564 1726882920.05521: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882920.05662: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882920.05875: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882920.06118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882920.06177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882920.06208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882920.06256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882920.06280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882920.06330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882920.06375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882920.06405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882920.06450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882920.06474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882920.06725: variable 'network_connections' from source: include params 
30564 1726882920.06807: variable 'interface' from source: play vars 30564 1726882920.06929: variable 'interface' from source: play vars 30564 1726882920.07248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882920.07296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882920.07333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882920.07371: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882920.07431: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882920.07761: variable 'network_connections' from source: include params 30564 1726882920.07774: variable 'interface' from source: play vars 30564 1726882920.07887: variable 'interface' from source: play vars 30564 1726882920.07910: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882920.07987: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882920.08192: variable 'network_connections' from source: include params 30564 1726882920.08195: variable 'interface' from source: play vars 30564 1726882920.08240: variable 'interface' from source: play vars 30564 1726882920.08257: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882920.08316: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882920.08528: variable 'network_connections' 
from source: include params 30564 1726882920.08531: variable 'interface' from source: play vars 30564 1726882920.08591: variable 'interface' from source: play vars 30564 1726882920.08635: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882920.08695: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882920.08713: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882920.08782: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882920.09036: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882920.09611: variable 'network_connections' from source: include params 30564 1726882920.09621: variable 'interface' from source: play vars 30564 1726882920.09687: variable 'interface' from source: play vars 30564 1726882920.09709: variable 'ansible_distribution' from source: facts 30564 1726882920.09718: variable '__network_rh_distros' from source: role '' defaults 30564 1726882920.09729: variable 'ansible_distribution_major_version' from source: facts 30564 1726882920.09744: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882920.09940: variable 'ansible_distribution' from source: facts 30564 1726882920.09944: variable '__network_rh_distros' from source: role '' defaults 30564 1726882920.09949: variable 'ansible_distribution_major_version' from source: facts 30564 1726882920.09962: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882920.10099: variable 'ansible_distribution' from source: facts 30564 1726882920.10102: variable '__network_rh_distros' from source: role '' defaults 30564 1726882920.10115: variable 'ansible_distribution_major_version' from source: facts 30564 1726882920.10157: variable 'network_provider' from source: set_fact 30564 
1726882920.10175: variable 'ansible_facts' from source: unknown 30564 1726882920.11153: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882920.11156: when evaluation is False, skipping this task 30564 1726882920.11158: _execute() done 30564 1726882920.11160: dumping result to json 30564 1726882920.11162: done dumping result, returning 30564 1726882920.11165: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-0000000024ac] 30564 1726882920.11168: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ac 30564 1726882920.11237: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ac 30564 1726882920.11240: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882920.11297: no more pending results, returning what we have 30564 1726882920.11301: results queue empty 30564 1726882920.11302: checking for any_errors_fatal 30564 1726882920.11309: done checking for any_errors_fatal 30564 1726882920.11310: checking for max_fail_percentage 30564 1726882920.11312: done checking for max_fail_percentage 30564 1726882920.11313: checking to see if all hosts have failed and the running result is not ok 30564 1726882920.11313: done checking to see if all hosts have failed 30564 1726882920.11314: getting the remaining hosts for this loop 30564 1726882920.11316: done getting the remaining hosts for this loop 30564 1726882920.11321: getting the next task for host managed_node2 30564 1726882920.11327: done getting next task for host managed_node2 30564 1726882920.11330: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882920.11335: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882920.11354: getting variables 30564 1726882920.11357: in VariableManager get_vars() 30564 1726882920.11399: Calling all_inventory to load vars for managed_node2 30564 1726882920.11402: Calling groups_inventory to load vars for managed_node2 30564 1726882920.11405: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882920.11416: Calling all_plugins_play to load vars for managed_node2 30564 1726882920.11419: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882920.11422: Calling groups_plugins_play to load vars for managed_node2 30564 1726882920.13527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882920.15075: done with get_vars() 30564 1726882920.15114: done getting variables 30564 1726882920.15176: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:42:00 -0400 (0:00:00.184) 0:01:58.733 ****** 30564 1726882920.15221: entering _queue_task() for managed_node2/package 30564 1726882920.15584: worker is 1 (out of 1 available) 30564 1726882920.15597: exiting _queue_task() for managed_node2/package 30564 1726882920.15610: done queuing things up, now waiting for results queue to drain 30564 1726882920.15611: waiting for pending results... 
30564 1726882920.15802: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
30564 1726882920.15899: in run() - task 0e448fcc-3ce9-4216-acec-0000000024ad
30564 1726882920.15910: variable 'ansible_search_path' from source: unknown
30564 1726882920.15914: variable 'ansible_search_path' from source: unknown
30564 1726882920.15949: calling self._execute()
30564 1726882920.16032: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882920.16036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882920.16045: variable 'omit' from source: magic vars
30564 1726882920.16346: variable 'ansible_distribution_major_version' from source: facts
30564 1726882920.16369: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882920.16475: variable 'network_state' from source: role '' defaults
30564 1726882920.16485: Evaluated conditional (network_state != {}): False
30564 1726882920.16488: when evaluation is False, skipping this task
30564 1726882920.16491: _execute() done
30564 1726882920.16493: dumping result to json
30564 1726882920.16496: done dumping result, returning
30564 1726882920.16502: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-0000000024ad]
30564 1726882920.16510: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ad
30564 1726882920.16641: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ad
30564 1726882920.16645: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882920.16704: no more pending results, returning what we have
30564 1726882920.16708: results queue empty
30564 1726882920.16709: checking for any_errors_fatal
30564 1726882920.16717: done checking for any_errors_fatal
30564 1726882920.16718: checking for max_fail_percentage
30564 1726882920.16719: done checking for max_fail_percentage
30564 1726882920.16722: checking to see if all hosts have failed and the running result is not ok
30564 1726882920.16723: done checking to see if all hosts have failed
30564 1726882920.16724: getting the remaining hosts for this loop
30564 1726882920.16726: done getting the remaining hosts for this loop
30564 1726882920.16787: getting the next task for host managed_node2
30564 1726882920.16797: done getting next task for host managed_node2
30564 1726882920.16801: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30564 1726882920.16806: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882920.16835: getting variables
30564 1726882920.16837: in VariableManager get_vars()
30564 1726882920.16885: Calling all_inventory to load vars for managed_node2
30564 1726882920.16902: Calling groups_inventory to load vars for managed_node2
30564 1726882920.16905: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882920.16917: Calling all_plugins_play to load vars for managed_node2
30564 1726882920.16920: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882920.16923: Calling groups_plugins_play to load vars for managed_node2
30564 1726882920.18540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882920.20952: done with get_vars()
30564 1726882920.20976: done getting variables
30564 1726882920.21021: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:42:00 -0400 (0:00:00.058) 0:01:58.791 ******
30564 1726882920.21047: entering _queue_task() for managed_node2/package
30564 1726882920.21292: worker is 1 (out of 1 available)
30564 1726882920.21307: exiting _queue_task() for managed_node2/package
30564 1726882920.21328: done queuing things up, now waiting for results queue to drain
30564 1726882920.21330: waiting for pending results...
30564 1726882920.21525: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
30564 1726882920.21630: in run() - task 0e448fcc-3ce9-4216-acec-0000000024ae
30564 1726882920.21640: variable 'ansible_search_path' from source: unknown
30564 1726882920.21643: variable 'ansible_search_path' from source: unknown
30564 1726882920.21682: calling self._execute()
30564 1726882920.21767: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882920.21773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882920.21778: variable 'omit' from source: magic vars
30564 1726882920.22058: variable 'ansible_distribution_major_version' from source: facts
30564 1726882920.22097: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882920.22228: variable 'network_state' from source: role '' defaults
30564 1726882920.22242: Evaluated conditional (network_state != {}): False
30564 1726882920.22249: when evaluation is False, skipping this task
30564 1726882920.22256: _execute() done
30564 1726882920.22262: dumping result to json
30564 1726882920.22275: done dumping result, returning
30564 1726882920.22288: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-0000000024ae]
30564 1726882920.22299: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ae
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
30564 1726882920.22476: no more pending results, returning what we have
30564 1726882920.22481: results queue empty
30564 1726882920.22482: checking for any_errors_fatal
30564 1726882920.22492: done checking for any_errors_fatal
30564 1726882920.22493: checking for max_fail_percentage
30564 1726882920.22495: done checking for max_fail_percentage
30564 1726882920.22496: checking to see if all hosts have failed and the running result is not ok
30564 1726882920.22497: done checking to see if all hosts have failed
30564 1726882920.22497: getting the remaining hosts for this loop
30564 1726882920.22499: done getting the remaining hosts for this loop
30564 1726882920.22503: getting the next task for host managed_node2
30564 1726882920.22514: done getting next task for host managed_node2
30564 1726882920.22518: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30564 1726882920.22525: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882920.22570: getting variables
30564 1726882920.22572: in VariableManager get_vars()
30564 1726882920.22621: Calling all_inventory to load vars for managed_node2
30564 1726882920.22624: Calling groups_inventory to load vars for managed_node2
30564 1726882920.22627: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882920.22653: Calling all_plugins_play to load vars for managed_node2
30564 1726882920.22657: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882920.22677: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024ae
30564 1726882920.22680: WORKER PROCESS EXITING
30564 1726882920.23203: Calling groups_plugins_play to load vars for managed_node2
30564 1726882920.24787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882920.27025: done with get_vars()
30564 1726882920.27049: done getting variables
30564 1726882920.27118: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:42:00 -0400 (0:00:00.061) 0:01:58.852 ******
30564 1726882920.27152: entering _queue_task() for managed_node2/service
30564 1726882920.27471: worker is 1 (out of 1 available)
30564 1726882920.27486: exiting _queue_task() for managed_node2/service
30564 1726882920.27509: done queuing things up, now waiting for results queue to drain
30564 1726882920.27511: waiting for pending results...
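The restart task queued above is guarded by the role's wireless/team detection flags, whose evaluation the trace shows next. A hypothetical sketch of the task shape — assumed, inferred only from the task name, module, and conditional names this trace reports, not from the role's actual tasks/main.yml:

```yaml
# Assumed reconstruction -- the real tasks/main.yml:109 may differ in detail.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

With no wireless or team connection in `network_connections`, both flags evaluate False and the restart is skipped, which matches the skip result that follows.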
30564 1726882920.27861: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
30564 1726882920.28041: in run() - task 0e448fcc-3ce9-4216-acec-0000000024af
30564 1726882920.28075: variable 'ansible_search_path' from source: unknown
30564 1726882920.28085: variable 'ansible_search_path' from source: unknown
30564 1726882920.28127: calling self._execute()
30564 1726882920.28244: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882920.28257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882920.28286: variable 'omit' from source: magic vars
30564 1726882920.29595: variable 'ansible_distribution_major_version' from source: facts
30564 1726882920.29684: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882920.29917: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882920.30137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882920.35175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882920.35249: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882920.35301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882920.35339: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882920.35377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882920.35466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.35523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.35553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.35612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.35632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.35686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.35722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.35752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.35802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.35829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.35880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.35915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.35953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.36006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.36025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.36239: variable 'network_connections' from source: include params
30564 1726882920.36271: variable 'interface' from source: play vars
30564 1726882920.36353: variable 'interface' from source: play vars
30564 1726882920.36444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882920.36641: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882920.36709: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882920.36748: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882920.36801: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882920.36849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882920.36885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882920.36927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.36961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882920.37039: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882920.37351: variable 'network_connections' from source: include params
30564 1726882920.37371: variable 'interface' from source: play vars
30564 1726882920.37447: variable 'interface' from source: play vars
30564 1726882920.37491: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
30564 1726882920.37500: when evaluation is False, skipping this task
30564 1726882920.37508: _execute() done
30564 1726882920.37513: dumping result to json
30564 1726882920.37521: done dumping result, returning
30564 1726882920.37535: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-0000000024af]
30564 1726882920.37544: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024af
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
30564 1726882920.37718: no more pending results, returning what we have
30564 1726882920.37723: results queue empty
30564 1726882920.37724: checking for any_errors_fatal
30564 1726882920.37732: done checking for any_errors_fatal
30564 1726882920.37733: checking for max_fail_percentage
30564 1726882920.37735: done checking for max_fail_percentage
30564 1726882920.37736: checking to see if all hosts have failed and the running result is not ok
30564 1726882920.37736: done checking to see if all hosts have failed
30564 1726882920.37737: getting the remaining hosts for this loop
30564 1726882920.37739: done getting the remaining hosts for this loop
30564 1726882920.37744: getting the next task for host managed_node2
30564 1726882920.37753: done getting next task for host managed_node2
30564 1726882920.37758: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30564 1726882920.37765: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882920.37799: getting variables
30564 1726882920.37802: in VariableManager get_vars()
30564 1726882920.37849: Calling all_inventory to load vars for managed_node2
30564 1726882920.37852: Calling groups_inventory to load vars for managed_node2
30564 1726882920.37855: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882920.37869: Calling all_plugins_play to load vars for managed_node2
30564 1726882920.37873: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882920.37876: Calling groups_plugins_play to load vars for managed_node2
30564 1726882920.38822: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024af
30564 1726882920.38825: WORKER PROCESS EXITING
30564 1726882920.41335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882920.43840: done with get_vars()
30564 1726882920.43884: done getting variables
30564 1726882920.43993: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:42:00 -0400 (0:00:00.168) 0:01:59.021 ******
30564 1726882920.44034: entering _queue_task() for managed_node2/service
30564 1726882920.44413: worker is 1 (out of 1 available)
30564 1726882920.44425: exiting _queue_task() for managed_node2/service
30564 1726882920.44437: done queuing
things up, now waiting for results queue to drain
30564 1726882920.44438: waiting for pending results...
30564 1726882920.44746: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
30564 1726882920.44899: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b0
30564 1726882920.44943: variable 'ansible_search_path' from source: unknown
30564 1726882920.44951: variable 'ansible_search_path' from source: unknown
30564 1726882920.44997: calling self._execute()
30564 1726882920.45138: variable 'ansible_host' from source: host vars for 'managed_node2'
30564 1726882920.45156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
30564 1726882920.45187: variable 'omit' from source: magic vars
30564 1726882920.45721: variable 'ansible_distribution_major_version' from source: facts
30564 1726882920.45739: Evaluated conditional (ansible_distribution_major_version != '6'): True
30564 1726882920.45928: variable 'network_provider' from source: set_fact
30564 1726882920.45939: variable 'network_state' from source: role '' defaults
30564 1726882920.45953: Evaluated conditional (network_provider == "nm" or network_state != {}): True
30564 1726882920.45967: variable 'omit' from source: magic vars
30564 1726882920.46059: variable 'omit' from source: magic vars
30564 1726882920.46119: variable 'network_service_name' from source: role '' defaults
30564 1726882920.46204: variable 'network_service_name' from source: role '' defaults
30564 1726882920.46387: variable '__network_provider_setup' from source: role '' defaults
30564 1726882920.46409: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882920.46540: variable '__network_service_name_default_nm' from source: role '' defaults
30564 1726882920.46560: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882920.46637: variable '__network_packages_default_nm' from source: role '' defaults
30564 1726882920.46981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
30564 1726882920.51753: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
30564 1726882920.51836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
30564 1726882920.51896: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
30564 1726882920.51948: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
30564 1726882920.51990: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
30564 1726882920.52080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.52127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.52158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.52215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.52242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.52340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.52404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.52502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.52593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.52625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.52950: variable '__network_packages_default_gobject_packages' from source: role '' defaults
30564 1726882920.53092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.53132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.53194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.53241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.53261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.53385: variable 'ansible_python' from source: facts
30564 1726882920.53420: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
30564 1726882920.54376: variable '__network_wpa_supplicant_required' from source: role '' defaults
30564 1726882920.54626: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
30564 1726882920.54897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.54997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.55093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.55208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.55249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.55350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
30564 1726882920.55553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
30564 1726882920.55588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.55704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
30564 1726882920.55776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
30564 1726882920.56048: variable 'network_connections' from source: include params
30564 1726882920.56184: variable 'interface' from source: play vars
30564 1726882920.56269: variable 'interface' from source: play vars
30564 1726882920.56736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
30564 1726882920.57119: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
30564 1726882920.57173: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
30564 1726882920.57216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
30564 1726882920.57318: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
30564 1726882920.57580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
30564 1726882920.57632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
30564 1726882920.57666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
30564 1726882920.57790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
30564 1726882920.57925: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882920.58361: variable 'network_connections' from source: include params
30564 1726882920.58371: variable 'interface' from source: play vars
30564 1726882920.58443: variable 'interface' from source: play vars
30564 1726882920.58479: variable '__network_packages_default_wireless' from source: role '' defaults
30564 1726882920.58654: variable '__network_wireless_connections_defined' from source: role '' defaults
30564 1726882920.59373: variable 'network_connections' from source: include params
30564 1726882920.59377: variable 'interface' from source: play vars
30564 1726882920.59443: variable 'interface' from source: play vars
30564 1726882920.59472: variable '__network_packages_default_team' from source: role '' defaults
30564 1726882920.59544: variable '__network_team_connections_defined' from source: role '' defaults
30564 1726882920.59848: variable 'network_connections' from source: include params
30564 1726882920.59852: variable 'interface' from source: play vars
30564 1726882920.59924: variable 'interface' from source: play vars
30564 1726882920.59976: variable '__network_service_name_default_initscripts'
from source: role '' defaults 30564 1726882920.60037: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882920.60043: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882920.60101: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882920.60427: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882920.60985: variable 'network_connections' from source: include params 30564 1726882920.60992: variable 'interface' from source: play vars 30564 1726882920.61076: variable 'interface' from source: play vars 30564 1726882920.61083: variable 'ansible_distribution' from source: facts 30564 1726882920.61088: variable '__network_rh_distros' from source: role '' defaults 30564 1726882920.61097: variable 'ansible_distribution_major_version' from source: facts 30564 1726882920.61133: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882920.61354: variable 'ansible_distribution' from source: facts 30564 1726882920.61357: variable '__network_rh_distros' from source: role '' defaults 30564 1726882920.61364: variable 'ansible_distribution_major_version' from source: facts 30564 1726882920.61378: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882920.61558: variable 'ansible_distribution' from source: facts 30564 1726882920.61561: variable '__network_rh_distros' from source: role '' defaults 30564 1726882920.61570: variable 'ansible_distribution_major_version' from source: facts 30564 1726882920.61604: variable 'network_provider' from source: set_fact 30564 1726882920.61677: variable 'omit' from source: magic vars 30564 1726882920.61704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882920.61732: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882920.61750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882920.61770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882920.61778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882920.61807: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882920.61810: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882920.61813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882920.61913: Set connection var ansible_timeout to 10 30564 1726882920.61916: Set connection var ansible_pipelining to False 30564 1726882920.61919: Set connection var ansible_shell_type to sh 30564 1726882920.61925: Set connection var ansible_shell_executable to /bin/sh 30564 1726882920.61933: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882920.61935: Set connection var ansible_connection to ssh 30564 1726882920.61963: variable 'ansible_shell_executable' from source: unknown 30564 1726882920.61966: variable 'ansible_connection' from source: unknown 30564 1726882920.61980: variable 'ansible_module_compression' from source: unknown 30564 1726882920.61984: variable 'ansible_shell_type' from source: unknown 30564 1726882920.61986: variable 'ansible_shell_executable' from source: unknown 30564 1726882920.61988: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882920.61990: variable 'ansible_pipelining' from source: unknown 30564 1726882920.61992: variable 'ansible_timeout' from source: unknown 30564 1726882920.61995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882920.62093: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882920.62104: variable 'omit' from source: magic vars 30564 1726882920.62111: starting attempt loop 30564 1726882920.62117: running the handler 30564 1726882920.62273: variable 'ansible_facts' from source: unknown 30564 1726882920.64603: _low_level_execute_command(): starting 30564 1726882920.64608: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882920.66160: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882920.66176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.66187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.66202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.66240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.66251: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882920.66260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.66277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882920.66285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882920.66291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882920.66299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.66309: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882920.66321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.66330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.66335: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882920.66345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.66422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882920.66441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882920.66454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882920.66691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882920.68270: stdout chunk (state=3): >>>/root <<< 30564 1726882920.68417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882920.68420: stdout chunk (state=3): >>><<< 30564 1726882920.68430: stderr chunk (state=3): >>><<< 30564 1726882920.68449: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882920.68461: _low_level_execute_command(): starting 30564 1726882920.68472: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571 `" && echo ansible-tmp-1726882920.6844902-35779-128781844566571="` echo /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571 `" ) && sleep 0' 30564 1726882920.70707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.70711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.70753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.70757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882920.70775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.70778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.70798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882920.70801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.70875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882920.71005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882920.71009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882920.71140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882920.73031: stdout chunk (state=3): >>>ansible-tmp-1726882920.6844902-35779-128781844566571=/root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571 <<< 30564 1726882920.73182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882920.73240: stderr chunk (state=3): >>><<< 30564 1726882920.73243: stdout chunk (state=3): >>><<< 30564 1726882920.73478: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882920.6844902-35779-128781844566571=/root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882920.73481: variable 'ansible_module_compression' from source: unknown 30564 1726882920.73488: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882920.73491: variable 'ansible_facts' from source: unknown 30564 1726882920.73625: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571/AnsiballZ_systemd.py 30564 1726882920.74299: Sending initial data 30564 1726882920.74302: Sent initial data (156 bytes) 30564 1726882920.76794: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882920.76854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.76877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.76895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.76934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.76986: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882920.77007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.77025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882920.77072: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882920.77091: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 
1726882920.77104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.77119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.77136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.77149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.77165: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882920.77187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.77348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882920.77376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882920.77406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882920.77541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882920.79348: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882920.79444: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882920.79543: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-30564uwjv555r/tmpsj9lkidv /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571/AnsiballZ_systemd.py <<< 30564 1726882920.79637: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882920.82783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882920.82991: stderr chunk (state=3): >>><<< 30564 1726882920.82994: stdout chunk (state=3): >>><<< 30564 1726882920.82997: done transferring module to remote 30564 1726882920.82999: _low_level_execute_command(): starting 30564 1726882920.83001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571/ /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571/AnsiballZ_systemd.py && sleep 0' 30564 1726882920.84496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882920.84511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.84527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.84545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.84591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.84604: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882920.84618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.84635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882920.84647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882920.84658: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882920.84676: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.84690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.84707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.84721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.84734: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882920.84748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.84830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882920.84862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882920.84884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882920.85014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882920.86857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882920.86861: stdout chunk (state=3): >>><<< 30564 1726882920.86865: stderr chunk (state=3): >>><<< 30564 1726882920.86959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882920.86963: _low_level_execute_command(): starting 30564 1726882920.86975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571/AnsiballZ_systemd.py && sleep 0' 30564 1726882920.88237: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882920.88483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.88497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.88514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.88554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.88575: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882920.88591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.88609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882920.88622: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882920.88634: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882920.88647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882920.88661: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882920.88685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882920.88699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882920.88711: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882920.88724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882920.89241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882920.89263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882920.89285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882920.89420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882921.14069: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9191424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2382249000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", 
"MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 30564 1726882921.14105: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid 
cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": 
"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882921.15613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.11.158 closed. <<< 30564 1726882921.15618: stdout chunk (state=3): >>><<< 30564 1726882921.15621: stderr chunk (state=3): >>><<< 30564 1726882921.15677: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9191424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2382249000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882921.15919: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882921.15931: _low_level_execute_command(): starting 30564 1726882921.15934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882920.6844902-35779-128781844566571/ > /dev/null 2>&1 && sleep 0' 30564 1726882921.18230: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.18235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.18278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.18284: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 30564 1726882921.18308: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.18343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.18499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.18541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882921.18623: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882921.18674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.18947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882921.19023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882921.19171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882921.19374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882921.21180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882921.21203: stderr chunk (state=3): >>><<< 30564 1726882921.21206: stdout chunk (state=3): >>><<< 30564 1726882921.21217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882921.21224: handler run complete 30564 1726882921.21262: attempt loop complete, returning result 30564 1726882921.21267: _execute() done 30564 1726882921.21269: dumping result to json 30564 1726882921.21283: done dumping result, returning 30564 1726882921.21291: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-0000000024b0] 30564 1726882921.21300: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b0 30564 1726882921.21501: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b0 30564 1726882921.21504: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882921.21570: no more pending results, returning what we have 30564 1726882921.21573: results queue empty 30564 1726882921.21574: checking for any_errors_fatal 30564 1726882921.21582: done checking for any_errors_fatal 30564 1726882921.21583: checking for max_fail_percentage 30564 1726882921.21584: done checking for max_fail_percentage 30564 1726882921.21585: checking to see if all hosts have failed and the running result is not ok 30564 1726882921.21586: done checking to see if all hosts have failed 30564 
1726882921.21587: getting the remaining hosts for this loop 30564 1726882921.21589: done getting the remaining hosts for this loop 30564 1726882921.21592: getting the next task for host managed_node2 30564 1726882921.21600: done getting next task for host managed_node2 30564 1726882921.21604: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882921.21610: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882921.21624: getting variables 30564 1726882921.21626: in VariableManager get_vars() 30564 1726882921.21669: Calling all_inventory to load vars for managed_node2 30564 1726882921.21671: Calling groups_inventory to load vars for managed_node2 30564 1726882921.21674: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882921.21684: Calling all_plugins_play to load vars for managed_node2 30564 1726882921.21686: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882921.21689: Calling groups_plugins_play to load vars for managed_node2 30564 1726882921.22665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882921.24733: done with get_vars() 30564 1726882921.24757: done getting variables 30564 1726882921.24802: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:42:01 -0400 (0:00:00.808) 0:01:59.829 ****** 30564 1726882921.24835: entering _queue_task() for managed_node2/service 30564 1726882921.25134: worker is 1 (out of 1 available) 30564 1726882921.25146: exiting _queue_task() for managed_node2/service 30564 1726882921.25157: done queuing things up, now waiting for results queue to drain 30564 1726882921.25159: waiting for pending results... 
30564 1726882921.25332: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882921.25441: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b1 30564 1726882921.25452: variable 'ansible_search_path' from source: unknown 30564 1726882921.25455: variable 'ansible_search_path' from source: unknown 30564 1726882921.25494: calling self._execute() 30564 1726882921.25585: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882921.25589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882921.25600: variable 'omit' from source: magic vars 30564 1726882921.26017: variable 'ansible_distribution_major_version' from source: facts 30564 1726882921.26021: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882921.26080: variable 'network_provider' from source: set_fact 30564 1726882921.26086: Evaluated conditional (network_provider == "nm"): True 30564 1726882921.26149: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882921.26231: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882921.26352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882921.28578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882921.28641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882921.28681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882921.28715: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882921.28746: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882921.28824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882921.28852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882921.28881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882921.28922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882921.28936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882921.28986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882921.29008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882921.29034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882921.29078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882921.29109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882921.29139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882921.29167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882921.29194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882921.29242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882921.29262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882921.29444: variable 'network_connections' from source: include params 30564 1726882921.29462: variable 'interface' from source: play vars 30564 1726882921.29529: variable 'interface' from source: play vars 30564 1726882921.29615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882921.29789: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882921.29824: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882921.29853: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882921.29886: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882921.29934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882921.29959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882921.29989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882921.30030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882921.30097: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882921.30359: variable 'network_connections' from source: include params 30564 1726882921.30365: variable 'interface' from source: play vars 30564 1726882921.30425: variable 'interface' from source: play vars 30564 1726882921.30460: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882921.30465: when evaluation is False, skipping this task 30564 1726882921.30468: _execute() done 30564 1726882921.30470: dumping result to json 30564 1726882921.30478: done dumping result, returning 30564 1726882921.30483: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-0000000024b1] 30564 
1726882921.30542: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b1 30564 1726882921.30712: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b1 30564 1726882921.30716: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882921.30760: no more pending results, returning what we have 30564 1726882921.30765: results queue empty 30564 1726882921.30766: checking for any_errors_fatal 30564 1726882921.30787: done checking for any_errors_fatal 30564 1726882921.30787: checking for max_fail_percentage 30564 1726882921.30789: done checking for max_fail_percentage 30564 1726882921.30790: checking to see if all hosts have failed and the running result is not ok 30564 1726882921.30791: done checking to see if all hosts have failed 30564 1726882921.30791: getting the remaining hosts for this loop 30564 1726882921.30817: done getting the remaining hosts for this loop 30564 1726882921.30821: getting the next task for host managed_node2 30564 1726882921.30831: done getting next task for host managed_node2 30564 1726882921.30835: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882921.30841: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882921.30958: getting variables 30564 1726882921.30962: in VariableManager get_vars() 30564 1726882921.31070: Calling all_inventory to load vars for managed_node2 30564 1726882921.31073: Calling groups_inventory to load vars for managed_node2 30564 1726882921.31076: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882921.31088: Calling all_plugins_play to load vars for managed_node2 30564 1726882921.31091: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882921.31122: Calling groups_plugins_play to load vars for managed_node2 30564 1726882921.34826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882921.36939: done with get_vars() 30564 1726882921.36970: done getting variables 30564 1726882921.37035: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:42:01 -0400 (0:00:00.122) 0:01:59.952 
****** 30564 1726882921.37080: entering _queue_task() for managed_node2/service 30564 1726882921.37456: worker is 1 (out of 1 available) 30564 1726882921.37482: exiting _queue_task() for managed_node2/service 30564 1726882921.37496: done queuing things up, now waiting for results queue to drain 30564 1726882921.37498: waiting for pending results... 30564 1726882921.37832: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882921.38031: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b2 30564 1726882921.38054: variable 'ansible_search_path' from source: unknown 30564 1726882921.38067: variable 'ansible_search_path' from source: unknown 30564 1726882921.38115: calling self._execute() 30564 1726882921.39072: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882921.39086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882921.39103: variable 'omit' from source: magic vars 30564 1726882921.39824: variable 'ansible_distribution_major_version' from source: facts 30564 1726882921.39843: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882921.39979: variable 'network_provider' from source: set_fact 30564 1726882921.39996: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882921.40012: when evaluation is False, skipping this task 30564 1726882921.40019: _execute() done 30564 1726882921.40026: dumping result to json 30564 1726882921.40033: done dumping result, returning 30564 1726882921.40043: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-0000000024b2] 30564 1726882921.40054: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b2 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
30564 1726882921.40212: no more pending results, returning what we have 30564 1726882921.40217: results queue empty 30564 1726882921.40218: checking for any_errors_fatal 30564 1726882921.40231: done checking for any_errors_fatal 30564 1726882921.40232: checking for max_fail_percentage 30564 1726882921.40234: done checking for max_fail_percentage 30564 1726882921.40235: checking to see if all hosts have failed and the running result is not ok 30564 1726882921.40236: done checking to see if all hosts have failed 30564 1726882921.40237: getting the remaining hosts for this loop 30564 1726882921.40239: done getting the remaining hosts for this loop 30564 1726882921.40243: getting the next task for host managed_node2 30564 1726882921.40254: done getting next task for host managed_node2 30564 1726882921.40259: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882921.40266: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882921.40304: getting variables 30564 1726882921.40306: in VariableManager get_vars() 30564 1726882921.40356: Calling all_inventory to load vars for managed_node2 30564 1726882921.40359: Calling groups_inventory to load vars for managed_node2 30564 1726882921.40361: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882921.40376: Calling all_plugins_play to load vars for managed_node2 30564 1726882921.40379: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882921.40382: Calling groups_plugins_play to load vars for managed_node2 30564 1726882921.41317: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b2 30564 1726882921.41321: WORKER PROCESS EXITING 30564 1726882921.42342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882921.44338: done with get_vars() 30564 1726882921.44361: done getting variables 30564 1726882921.44432: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:42:01 -0400 (0:00:00.073) 0:02:00.025 ****** 30564 1726882921.44469: entering _queue_task() for managed_node2/copy 30564 1726882921.44805: worker is 1 (out of 1 available) 30564 1726882921.44819: exiting _queue_task() for managed_node2/copy 30564 1726882921.44841: done queuing things up, now waiting for results queue to drain 30564 1726882921.44843: waiting for pending 
results... 30564 1726882921.46083: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882921.46302: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b3 30564 1726882921.46324: variable 'ansible_search_path' from source: unknown 30564 1726882921.46331: variable 'ansible_search_path' from source: unknown 30564 1726882921.46377: calling self._execute() 30564 1726882921.46478: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882921.46499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882921.46514: variable 'omit' from source: magic vars 30564 1726882921.46940: variable 'ansible_distribution_major_version' from source: facts 30564 1726882921.46960: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882921.47093: variable 'network_provider' from source: set_fact 30564 1726882921.47104: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882921.47111: when evaluation is False, skipping this task 30564 1726882921.47118: _execute() done 30564 1726882921.47126: dumping result to json 30564 1726882921.47136: done dumping result, returning 30564 1726882921.47160: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-0000000024b3] 30564 1726882921.47175: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b3 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882921.47332: no more pending results, returning what we have 30564 1726882921.47337: results queue empty 30564 1726882921.47338: checking for any_errors_fatal 30564 1726882921.47349: done checking for any_errors_fatal 30564 1726882921.47350: checking for max_fail_percentage 
30564 1726882921.47352: done checking for max_fail_percentage 30564 1726882921.47353: checking to see if all hosts have failed and the running result is not ok 30564 1726882921.47354: done checking to see if all hosts have failed 30564 1726882921.47355: getting the remaining hosts for this loop 30564 1726882921.47357: done getting the remaining hosts for this loop 30564 1726882921.47361: getting the next task for host managed_node2 30564 1726882921.47374: done getting next task for host managed_node2 30564 1726882921.47378: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882921.47384: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882921.47420: getting variables 30564 1726882921.47422: in VariableManager get_vars() 30564 1726882921.47468: Calling all_inventory to load vars for managed_node2 30564 1726882921.47470: Calling groups_inventory to load vars for managed_node2 30564 1726882921.47473: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882921.47486: Calling all_plugins_play to load vars for managed_node2 30564 1726882921.47489: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882921.47492: Calling groups_plugins_play to load vars for managed_node2 30564 1726882921.48524: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b3 30564 1726882921.48527: WORKER PROCESS EXITING 30564 1726882921.50305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882921.52545: done with get_vars() 30564 1726882921.52572: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:42:01 -0400 (0:00:00.081) 0:02:00.107 ****** 30564 1726882921.52660: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882921.52990: worker is 1 (out of 1 available) 30564 1726882921.53001: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882921.53033: done queuing things up, now waiting for results queue to drain 30564 1726882921.53035: waiting for pending results... 
30564 1726882921.53357: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882921.53525: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b4 30564 1726882921.53547: variable 'ansible_search_path' from source: unknown 30564 1726882921.53560: variable 'ansible_search_path' from source: unknown 30564 1726882921.53610: calling self._execute() 30564 1726882921.53733: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882921.53745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882921.53759: variable 'omit' from source: magic vars 30564 1726882921.54195: variable 'ansible_distribution_major_version' from source: facts 30564 1726882921.54214: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882921.54225: variable 'omit' from source: magic vars 30564 1726882921.54306: variable 'omit' from source: magic vars 30564 1726882921.54490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882921.67797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882921.67868: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882921.67920: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882921.67958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882921.68002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882921.68086: variable 'network_provider' from source: set_fact 30564 1726882921.68232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882921.68268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882921.68300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882921.68357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882921.68380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882921.68471: variable 'omit' from source: magic vars 30564 1726882921.69243: variable 'omit' from source: magic vars 30564 1726882921.69409: variable 'network_connections' from source: include params 30564 1726882921.69539: variable 'interface' from source: play vars 30564 1726882921.69604: variable 'interface' from source: play vars 30564 1726882921.69882: variable 'omit' from source: magic vars 30564 1726882921.69893: variable '__lsr_ansible_managed' from source: task vars 30564 1726882921.69945: variable '__lsr_ansible_managed' from source: task vars 30564 1726882921.70121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882921.70340: Loaded config def from plugin (lookup/template) 30564 1726882921.70351: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882921.70382: File lookup term: get_ansible_managed.j2 30564 1726882921.70390: variable 
'ansible_search_path' from source: unknown 30564 1726882921.70408: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882921.70427: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882921.70447: variable 'ansible_search_path' from source: unknown 30564 1726882921.80331: variable 'ansible_managed' from source: unknown 30564 1726882921.80745: variable 'omit' from source: magic vars 30564 1726882921.80823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882921.80926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882921.80944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882921.80967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882921.80982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882921.81121: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882921.81131: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882921.81139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882921.81355: Set connection var ansible_timeout to 10 30564 1726882921.81369: Set connection var ansible_pipelining to False 30564 1726882921.81377: Set connection var ansible_shell_type to sh 30564 1726882921.81389: Set connection var ansible_shell_executable to /bin/sh 30564 1726882921.81401: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882921.81408: Set connection var ansible_connection to ssh 30564 1726882921.81440: variable 'ansible_shell_executable' from source: unknown 30564 1726882921.81452: variable 'ansible_connection' from source: unknown 30564 1726882921.81460: variable 'ansible_module_compression' from source: unknown 30564 1726882921.81468: variable 'ansible_shell_type' from source: unknown 30564 1726882921.81476: variable 'ansible_shell_executable' from source: unknown 30564 1726882921.81484: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882921.81492: variable 'ansible_pipelining' from source: unknown 30564 1726882921.81562: variable 'ansible_timeout' from source: unknown 30564 1726882921.81575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882921.81813: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882921.81835: variable 'omit' from 
source: magic vars 30564 1726882921.81845: starting attempt loop 30564 1726882921.81852: running the handler 30564 1726882921.81869: _low_level_execute_command(): starting 30564 1726882921.82000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882921.83877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882921.83895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.83911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.83933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.83979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882921.83994: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882921.84011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.84034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882921.84045: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882921.84056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882921.84070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.84083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.84098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.84113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882921.84129: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882921.84149: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.84229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882921.84248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882921.84271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882921.84480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882921.86139: stdout chunk (state=3): >>>/root <<< 30564 1726882921.86328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882921.86331: stdout chunk (state=3): >>><<< 30564 1726882921.86333: stderr chunk (state=3): >>><<< 30564 1726882921.86439: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882921.86442: 
_low_level_execute_command(): starting 30564 1726882921.86445: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744 `" && echo ansible-tmp-1726882921.863505-35827-136565564939744="` echo /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744 `" ) && sleep 0' 30564 1726882921.86961: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.86971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.87010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.87015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.87025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.87031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.87089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882921.87106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882921.87217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 30564 1726882921.89099: stdout chunk (state=3): >>>ansible-tmp-1726882921.863505-35827-136565564939744=/root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744 <<< 30564 1726882921.89255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882921.89282: stderr chunk (state=3): >>><<< 30564 1726882921.89285: stdout chunk (state=3): >>><<< 30564 1726882921.89370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882921.863505-35827-136565564939744=/root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882921.89373: variable 'ansible_module_compression' from source: unknown 30564 1726882921.89430: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882921.89433: variable 'ansible_facts' from source: unknown 30564 1726882921.89520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744/AnsiballZ_network_connections.py 30564 1726882921.89657: Sending initial data 30564 1726882921.89660: Sent initial data (167 bytes) 30564 1726882921.90708: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882921.90724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.90728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.90746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.90777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882921.90784: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882921.90809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.90812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882921.90815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882921.90817: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882921.90834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.90837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.90854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.90856: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882921.90859: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882921.90881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.90959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882921.90982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882921.90987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882921.91091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882921.92840: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882921.92934: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882921.93030: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpjurxb5f_ /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744/AnsiballZ_network_connections.py <<< 30564 1726882921.93125: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882921.94907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882921.94979: stderr chunk (state=3): >>><<< 30564 
1726882921.94983: stdout chunk (state=3): >>><<< 30564 1726882921.95005: done transferring module to remote 30564 1726882921.95017: _low_level_execute_command(): starting 30564 1726882921.95022: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744/ /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744/AnsiballZ_network_connections.py && sleep 0' 30564 1726882921.95638: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.95644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.96280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.96328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882921.96352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882921.96579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882921.98300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882921.98343: stderr 
chunk (state=3): >>><<< 30564 1726882921.98347: stdout chunk (state=3): >>><<< 30564 1726882921.98360: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882921.98362: _low_level_execute_command(): starting 30564 1726882921.98373: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744/AnsiballZ_network_connections.py && sleep 0' 30564 1726882921.98799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.98809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.98837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 30564 1726882921.98843: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882921.98852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.98861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882921.98873: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882921.98882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882921.98890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882921.98896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882921.98948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882921.98973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882921.98979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882921.99085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.28927: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 30564 1726882922.28932: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1r19g6yb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1r19g6yb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/776ac6a9-ad06-421f-84d7-faa75bbe803f: error=unknown <<< 30564 1726882922.29132: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882922.30657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882922.30713: stderr chunk (state=3): >>><<< 30564 1726882922.30717: stdout chunk (state=3): >>><<< 30564 1726882922.30732: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1r19g6yb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1r19g6yb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on statebr/776ac6a9-ad06-421f-84d7-faa75bbe803f: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882922.30758: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882922.30766: _low_level_execute_command(): starting 30564 1726882922.30774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882921.863505-35827-136565564939744/ > /dev/null 2>&1 && sleep 0' 
30564 1726882922.31698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882922.31796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882922.31920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.33778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882922.33840: stderr chunk (state=3): >>><<< 30564 1726882922.33843: stdout chunk (state=3): >>><<< 30564 1726882922.33859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882922.33863: handler run complete 30564 1726882922.33898: attempt loop complete, returning result 30564 1726882922.33902: _execute() done 30564 1726882922.33904: dumping result to json 30564 1726882922.33906: done dumping result, returning 30564 1726882922.33916: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-0000000024b4] 30564 1726882922.33919: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b4 30564 1726882922.34029: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b4 30564 1726882922.34032: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 30564 1726882922.34125: no more pending results, returning what we have 30564 1726882922.34128: results queue empty 30564 1726882922.34129: checking for any_errors_fatal 30564 1726882922.34135: done checking for any_errors_fatal 30564 1726882922.34136: checking for max_fail_percentage 30564 
1726882922.34138: done checking for max_fail_percentage 30564 1726882922.34139: checking to see if all hosts have failed and the running result is not ok 30564 1726882922.34140: done checking to see if all hosts have failed 30564 1726882922.34140: getting the remaining hosts for this loop 30564 1726882922.34142: done getting the remaining hosts for this loop 30564 1726882922.34146: getting the next task for host managed_node2 30564 1726882922.34152: done getting next task for host managed_node2 30564 1726882922.34156: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882922.34161: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882922.34175: getting variables 30564 1726882922.34177: in VariableManager get_vars() 30564 1726882922.34217: Calling all_inventory to load vars for managed_node2 30564 1726882922.34220: Calling groups_inventory to load vars for managed_node2 30564 1726882922.34222: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882922.34231: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.34234: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.34236: Calling groups_plugins_play to load vars for managed_node2 30564 1726882922.41733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882922.42732: done with get_vars() 30564 1726882922.42756: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:42:02 -0400 (0:00:00.901) 0:02:01.009 ****** 30564 1726882922.42832: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882922.43193: worker is 1 (out of 1 available) 30564 1726882922.43204: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882922.43218: done queuing things up, now waiting for results queue to drain 30564 1726882922.43219: waiting for pending results... 
30564 1726882922.43516: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882922.43693: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b5 30564 1726882922.43714: variable 'ansible_search_path' from source: unknown 30564 1726882922.43724: variable 'ansible_search_path' from source: unknown 30564 1726882922.43771: calling self._execute() 30564 1726882922.43878: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.43894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.43911: variable 'omit' from source: magic vars 30564 1726882922.44241: variable 'ansible_distribution_major_version' from source: facts 30564 1726882922.44252: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882922.44345: variable 'network_state' from source: role '' defaults 30564 1726882922.44354: Evaluated conditional (network_state != {}): False 30564 1726882922.44359: when evaluation is False, skipping this task 30564 1726882922.44362: _execute() done 30564 1726882922.44366: dumping result to json 30564 1726882922.44372: done dumping result, returning 30564 1726882922.44375: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-0000000024b5] 30564 1726882922.44379: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b5 30564 1726882922.44472: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b5 30564 1726882922.44475: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882922.44529: no more pending results, returning what we have 30564 1726882922.44533: results queue empty 30564 1726882922.44534: checking for any_errors_fatal 30564 1726882922.44548: done checking for any_errors_fatal 
30564 1726882922.44549: checking for max_fail_percentage 30564 1726882922.44550: done checking for max_fail_percentage 30564 1726882922.44551: checking to see if all hosts have failed and the running result is not ok 30564 1726882922.44552: done checking to see if all hosts have failed 30564 1726882922.44553: getting the remaining hosts for this loop 30564 1726882922.44555: done getting the remaining hosts for this loop 30564 1726882922.44558: getting the next task for host managed_node2 30564 1726882922.44567: done getting next task for host managed_node2 30564 1726882922.44573: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882922.44578: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882922.44605: getting variables 30564 1726882922.44607: in VariableManager get_vars() 30564 1726882922.44643: Calling all_inventory to load vars for managed_node2 30564 1726882922.44645: Calling groups_inventory to load vars for managed_node2 30564 1726882922.44647: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882922.44656: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.44658: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.44661: Calling groups_plugins_play to load vars for managed_node2 30564 1726882922.45470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882922.47026: done with get_vars() 30564 1726882922.47047: done getting variables 30564 1726882922.47108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:42:02 -0400 (0:00:00.043) 0:02:01.052 ****** 30564 1726882922.47143: entering _queue_task() for managed_node2/debug 30564 1726882922.47409: worker is 1 (out of 1 available) 30564 1726882922.47422: exiting _queue_task() for managed_node2/debug 30564 1726882922.47432: done queuing things up, now waiting for results queue to drain 30564 1726882922.47433: waiting for pending results... 
30564 1726882922.47807: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882922.47987: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b6 30564 1726882922.47999: variable 'ansible_search_path' from source: unknown 30564 1726882922.48004: variable 'ansible_search_path' from source: unknown 30564 1726882922.48052: calling self._execute() 30564 1726882922.48134: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.48138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.48148: variable 'omit' from source: magic vars 30564 1726882922.48442: variable 'ansible_distribution_major_version' from source: facts 30564 1726882922.48453: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882922.48457: variable 'omit' from source: magic vars 30564 1726882922.48502: variable 'omit' from source: magic vars 30564 1726882922.48525: variable 'omit' from source: magic vars 30564 1726882922.48560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882922.48588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882922.48604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882922.48617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882922.48627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882922.48650: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882922.48654: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.48657: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882922.48727: Set connection var ansible_timeout to 10 30564 1726882922.48731: Set connection var ansible_pipelining to False 30564 1726882922.48733: Set connection var ansible_shell_type to sh 30564 1726882922.48739: Set connection var ansible_shell_executable to /bin/sh 30564 1726882922.48745: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882922.48749: Set connection var ansible_connection to ssh 30564 1726882922.48773: variable 'ansible_shell_executable' from source: unknown 30564 1726882922.48777: variable 'ansible_connection' from source: unknown 30564 1726882922.48780: variable 'ansible_module_compression' from source: unknown 30564 1726882922.48782: variable 'ansible_shell_type' from source: unknown 30564 1726882922.48785: variable 'ansible_shell_executable' from source: unknown 30564 1726882922.48787: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.48789: variable 'ansible_pipelining' from source: unknown 30564 1726882922.48791: variable 'ansible_timeout' from source: unknown 30564 1726882922.48793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.48890: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882922.48902: variable 'omit' from source: magic vars 30564 1726882922.48905: starting attempt loop 30564 1726882922.48908: running the handler 30564 1726882922.49006: variable '__network_connections_result' from source: set_fact 30564 1726882922.49047: handler run complete 30564 1726882922.49060: attempt loop complete, returning result 30564 1726882922.49063: _execute() done 30564 1726882922.49076: dumping result to json 30564 1726882922.49079: 
done dumping result, returning 30564 1726882922.49082: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000024b6] 30564 1726882922.49084: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b6 30564 1726882922.49177: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b6 30564 1726882922.49180: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 30564 1726882922.49260: no more pending results, returning what we have 30564 1726882922.49265: results queue empty 30564 1726882922.49266: checking for any_errors_fatal 30564 1726882922.49272: done checking for any_errors_fatal 30564 1726882922.49273: checking for max_fail_percentage 30564 1726882922.49276: done checking for max_fail_percentage 30564 1726882922.49277: checking to see if all hosts have failed and the running result is not ok 30564 1726882922.49278: done checking to see if all hosts have failed 30564 1726882922.49279: getting the remaining hosts for this loop 30564 1726882922.49280: done getting the remaining hosts for this loop 30564 1726882922.49283: getting the next task for host managed_node2 30564 1726882922.49290: done getting next task for host managed_node2 30564 1726882922.49294: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882922.49299: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882922.49311: getting variables 30564 1726882922.49312: in VariableManager get_vars() 30564 1726882922.49345: Calling all_inventory to load vars for managed_node2 30564 1726882922.49347: Calling groups_inventory to load vars for managed_node2 30564 1726882922.49349: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882922.49355: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.49356: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.49358: Calling groups_plugins_play to load vars for managed_node2 30564 1726882922.50321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882922.52081: done with get_vars() 30564 1726882922.52104: done getting variables 30564 1726882922.52154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the 
network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:42:02 -0400 (0:00:00.050) 0:02:01.103 ****** 30564 1726882922.52186: entering _queue_task() for managed_node2/debug 30564 1726882922.52392: worker is 1 (out of 1 available) 30564 1726882922.52405: exiting _queue_task() for managed_node2/debug 30564 1726882922.52416: done queuing things up, now waiting for results queue to drain 30564 1726882922.52417: waiting for pending results... 30564 1726882922.52660: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882922.52761: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b7 30564 1726882922.52776: variable 'ansible_search_path' from source: unknown 30564 1726882922.52783: variable 'ansible_search_path' from source: unknown 30564 1726882922.52828: calling self._execute() 30564 1726882922.52986: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.52989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.52992: variable 'omit' from source: magic vars 30564 1726882922.53378: variable 'ansible_distribution_major_version' from source: facts 30564 1726882922.53395: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882922.53412: variable 'omit' from source: magic vars 30564 1726882922.53485: variable 'omit' from source: magic vars 30564 1726882922.53532: variable 'omit' from source: magic vars 30564 1726882922.53580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882922.53621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882922.53682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882922.53723: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882922.53782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882922.53831: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882922.53839: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.53859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.53974: Set connection var ansible_timeout to 10 30564 1726882922.53979: Set connection var ansible_pipelining to False 30564 1726882922.53985: Set connection var ansible_shell_type to sh 30564 1726882922.53993: Set connection var ansible_shell_executable to /bin/sh 30564 1726882922.54011: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882922.54020: Set connection var ansible_connection to ssh 30564 1726882922.54039: variable 'ansible_shell_executable' from source: unknown 30564 1726882922.54042: variable 'ansible_connection' from source: unknown 30564 1726882922.54045: variable 'ansible_module_compression' from source: unknown 30564 1726882922.54048: variable 'ansible_shell_type' from source: unknown 30564 1726882922.54051: variable 'ansible_shell_executable' from source: unknown 30564 1726882922.54053: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.54055: variable 'ansible_pipelining' from source: unknown 30564 1726882922.54057: variable 'ansible_timeout' from source: unknown 30564 1726882922.54067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.54160: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882922.54178: variable 'omit' from source: magic vars 30564 1726882922.54188: starting attempt loop 30564 1726882922.54192: running the handler 30564 1726882922.54226: variable '__network_connections_result' from source: set_fact 30564 1726882922.54284: variable '__network_connections_result' from source: set_fact 30564 1726882922.54359: handler run complete 30564 1726882922.54378: attempt loop complete, returning result 30564 1726882922.54381: _execute() done 30564 1726882922.54385: dumping result to json 30564 1726882922.54387: done dumping result, returning 30564 1726882922.54397: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000024b7] 30564 1726882922.54400: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b7 30564 1726882922.54488: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b7 30564 1726882922.54491: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 30564 1726882922.54607: no more pending results, returning what we have 30564 1726882922.54610: results queue empty 30564 1726882922.54611: checking for any_errors_fatal 30564 1726882922.54615: done checking for any_errors_fatal 30564 1726882922.54617: checking for max_fail_percentage 30564 1726882922.54619: done checking for max_fail_percentage 30564 1726882922.54619: checking to see if 
all hosts have failed and the running result is not ok 30564 1726882922.54620: done checking to see if all hosts have failed 30564 1726882922.54621: getting the remaining hosts for this loop 30564 1726882922.54622: done getting the remaining hosts for this loop 30564 1726882922.54625: getting the next task for host managed_node2 30564 1726882922.54632: done getting next task for host managed_node2 30564 1726882922.54635: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882922.54639: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882922.54647: getting variables 30564 1726882922.54648: in VariableManager get_vars() 30564 1726882922.54681: Calling all_inventory to load vars for managed_node2 30564 1726882922.54683: Calling groups_inventory to load vars for managed_node2 30564 1726882922.54685: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882922.54692: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.54693: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.54699: Calling groups_plugins_play to load vars for managed_node2 30564 1726882922.55708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882922.57252: done with get_vars() 30564 1726882922.57271: done getting variables 30564 1726882922.57311: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:42:02 -0400 (0:00:00.051) 0:02:01.154 ****** 30564 1726882922.57334: entering _queue_task() for managed_node2/debug 30564 1726882922.57518: worker is 1 (out of 1 available) 30564 1726882922.57532: exiting _queue_task() for managed_node2/debug 30564 1726882922.57544: done queuing things up, now waiting for results queue to drain 30564 1726882922.57545: waiting for pending results... 
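The `ok: [managed_node2]` result above, dumping `__network_connections_result`, is produced by an ordinary `debug` task in the role. A minimal sketch of the shape of such a task (illustrative only; the real task ships inside `fedora.linux_system_roles.network` at `tasks/main.yml`):

```yaml
# Illustrative sketch -- the actual task belongs to the collection.
- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```

With `-vvvv`, each such task also emits the "Loading ActionModule 'debug'" and "Set connection var" lines seen above, even though `debug` never touches the remote host.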
30564 1726882922.57730: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882922.57831: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b8 30564 1726882922.57841: variable 'ansible_search_path' from source: unknown 30564 1726882922.57851: variable 'ansible_search_path' from source: unknown 30564 1726882922.57884: calling self._execute() 30564 1726882922.57965: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.57973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.57982: variable 'omit' from source: magic vars 30564 1726882922.58266: variable 'ansible_distribution_major_version' from source: facts 30564 1726882922.58281: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882922.58369: variable 'network_state' from source: role '' defaults 30564 1726882922.58381: Evaluated conditional (network_state != {}): False 30564 1726882922.58386: when evaluation is False, skipping this task 30564 1726882922.58388: _execute() done 30564 1726882922.58391: dumping result to json 30564 1726882922.58395: done dumping result, returning 30564 1726882922.58398: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-0000000024b8] 30564 1726882922.58408: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b8 30564 1726882922.58491: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b8 30564 1726882922.58494: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882922.58566: no more pending results, returning what we have 30564 1726882922.58569: results queue empty 30564 1726882922.58570: checking for any_errors_fatal 30564 1726882922.58576: done checking for any_errors_fatal 30564 1726882922.58577: checking for 
max_fail_percentage 30564 1726882922.58579: done checking for max_fail_percentage 30564 1726882922.58580: checking to see if all hosts have failed and the running result is not ok 30564 1726882922.58580: done checking to see if all hosts have failed 30564 1726882922.58581: getting the remaining hosts for this loop 30564 1726882922.58582: done getting the remaining hosts for this loop 30564 1726882922.58585: getting the next task for host managed_node2 30564 1726882922.58592: done getting next task for host managed_node2 30564 1726882922.58596: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882922.58601: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882922.58629: getting variables 30564 1726882922.58631: in VariableManager get_vars() 30564 1726882922.58658: Calling all_inventory to load vars for managed_node2 30564 1726882922.58659: Calling groups_inventory to load vars for managed_node2 30564 1726882922.58661: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882922.58669: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.58671: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.58674: Calling groups_plugins_play to load vars for managed_node2 30564 1726882922.59481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882922.60469: done with get_vars() 30564 1726882922.60485: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:42:02 -0400 (0:00:00.032) 0:02:01.186 ****** 30564 1726882922.60544: entering _queue_task() for managed_node2/ping 30564 1726882922.60724: worker is 1 (out of 1 available) 30564 1726882922.60738: exiting _queue_task() for managed_node2/ping 30564 1726882922.60750: done queuing things up, now waiting for results queue to drain 30564 1726882922.60751: waiting for pending results... 
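The skip above (`"false_condition": "network_state != {}"`) and the ping that follows correspond to guarded tasks of roughly this shape. This is a hedged sketch: the variable name `__network_state_result` is an assumption, and the real tasks live at `tasks/main.yml:186` and `:192` in the collection:

```yaml
# Illustrative sketch of the two tasks driving this part of the log.
- name: Show debug messages for the network_state
  debug:
    var: __network_state_result   # variable name is an assumption
  when: network_state != {}       # evaluated False here, so the task is skipped

- name: Re-test connectivity
  ping:                           # queued below as managed_node2/ping
```

Because `network_state` still holds the role default of `{}`, the `when:` evaluates to False and the executor reports "when evaluation is False, skipping this task", exactly as logged.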
30564 1726882922.60929: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882922.61025: in run() - task 0e448fcc-3ce9-4216-acec-0000000024b9 30564 1726882922.61035: variable 'ansible_search_path' from source: unknown 30564 1726882922.61038: variable 'ansible_search_path' from source: unknown 30564 1726882922.61067: calling self._execute() 30564 1726882922.61142: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.61146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.61155: variable 'omit' from source: magic vars 30564 1726882922.61425: variable 'ansible_distribution_major_version' from source: facts 30564 1726882922.61439: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882922.61442: variable 'omit' from source: magic vars 30564 1726882922.61486: variable 'omit' from source: magic vars 30564 1726882922.61507: variable 'omit' from source: magic vars 30564 1726882922.61538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882922.61566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882922.61583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882922.61596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882922.61605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882922.61628: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882922.61632: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.61634: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882922.61708: Set connection var ansible_timeout to 10 30564 1726882922.61712: Set connection var ansible_pipelining to False 30564 1726882922.61714: Set connection var ansible_shell_type to sh 30564 1726882922.61719: Set connection var ansible_shell_executable to /bin/sh 30564 1726882922.61725: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882922.61728: Set connection var ansible_connection to ssh 30564 1726882922.61746: variable 'ansible_shell_executable' from source: unknown 30564 1726882922.61749: variable 'ansible_connection' from source: unknown 30564 1726882922.61752: variable 'ansible_module_compression' from source: unknown 30564 1726882922.61754: variable 'ansible_shell_type' from source: unknown 30564 1726882922.61756: variable 'ansible_shell_executable' from source: unknown 30564 1726882922.61760: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.61762: variable 'ansible_pipelining' from source: unknown 30564 1726882922.61764: variable 'ansible_timeout' from source: unknown 30564 1726882922.61770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.61910: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882922.61918: variable 'omit' from source: magic vars 30564 1726882922.61923: starting attempt loop 30564 1726882922.61926: running the handler 30564 1726882922.61937: _low_level_execute_command(): starting 30564 1726882922.61944: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882922.62449: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882922.62458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882922.62495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.62508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.62559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882922.62576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882922.62693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.64365: stdout chunk (state=3): >>>/root <<< 30564 1726882922.64466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882922.64514: stderr chunk (state=3): >>><<< 30564 1726882922.64522: stdout chunk (state=3): >>><<< 30564 1726882922.64544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882922.64557: _low_level_execute_command(): starting 30564 1726882922.64570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752 `" && echo ansible-tmp-1726882922.6454885-35878-95415778847752="` echo /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752 `" ) && sleep 0' 30564 1726882922.65004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882922.65021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882922.65043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.65055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.65107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882922.65119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882922.65237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.67107: stdout chunk (state=3): >>>ansible-tmp-1726882922.6454885-35878-95415778847752=/root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752 <<< 30564 1726882922.67216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882922.67255: stderr chunk (state=3): >>><<< 30564 1726882922.67258: stdout chunk (state=3): >>><<< 30564 1726882922.67282: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882922.6454885-35878-95415778847752=/root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882922.67316: variable 'ansible_module_compression' from source: unknown 30564 1726882922.67347: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882922.67380: variable 'ansible_facts' from source: unknown 30564 1726882922.67435: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752/AnsiballZ_ping.py 30564 1726882922.67537: Sending initial data 30564 1726882922.67540: Sent initial data (152 bytes) 30564 1726882922.68189: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882922.68194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882922.68224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.68236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.68295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882922.68307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882922.68406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.70131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 30564 1726882922.70138: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882922.70225: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882922.70324: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp4hyaeu3l /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752/AnsiballZ_ping.py <<< 30564 1726882922.70418: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882922.71416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882922.71505: stderr chunk (state=3): >>><<< 30564 1726882922.71508: stdout chunk (state=3): >>><<< 30564 1726882922.71522: done transferring module to remote 30564 1726882922.71530: _low_level_execute_command(): starting 
30564 1726882922.71535: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752/ /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752/AnsiballZ_ping.py && sleep 0' 30564 1726882922.71953: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882922.71959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882922.72011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.72017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882922.72020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882922.72022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.72073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882922.72085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882922.72189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.73945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882922.73985: stderr chunk (state=3): 
>>><<< 30564 1726882922.73989: stdout chunk (state=3): >>><<< 30564 1726882922.74001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882922.74007: _low_level_execute_command(): starting 30564 1726882922.74010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752/AnsiballZ_ping.py && sleep 0' 30564 1726882922.74447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882922.74452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882922.74497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.74503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882922.74506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.74561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882922.74568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882922.74680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.87536: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882922.88497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882922.88542: stderr chunk (state=3): >>><<< 30564 1726882922.88546: stdout chunk (state=3): >>><<< 30564 1726882922.88560: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882922.88585: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882922.88594: _low_level_execute_command(): starting 30564 1726882922.88597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882922.6454885-35878-95415778847752/ > /dev/null 2>&1 && sleep 0' 30564 1726882922.89033: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882922.89039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882922.89074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.89080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882922.89088: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882922.89095: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882922.89101: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 30564 1726882922.89110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882922.89116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882922.89188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882922.89194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882922.89196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882922.89293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882922.91090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882922.91131: stderr chunk (state=3): >>><<< 30564 1726882922.91134: stdout chunk (state=3): >>><<< 30564 1726882922.91146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882922.91152: handler run complete 30564 1726882922.91165: attempt loop complete, returning result 30564 1726882922.91170: _execute() done 30564 1726882922.91173: dumping result to json 30564 1726882922.91175: done dumping result, returning 30564 1726882922.91182: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-0000000024b9] 30564 1726882922.91188: sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b9 30564 1726882922.91280: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000024b9 30564 1726882922.91283: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882922.91352: no more pending results, returning what we have 30564 1726882922.91355: results queue empty 30564 1726882922.91356: checking for any_errors_fatal 30564 1726882922.91365: done checking for any_errors_fatal 30564 1726882922.91366: checking for max_fail_percentage 30564 1726882922.91369: done checking for max_fail_percentage 30564 1726882922.91370: checking to see if all hosts have failed and the running result is not ok 30564 1726882922.91371: done checking to see if all hosts have failed 30564 1726882922.91372: getting the remaining hosts for this loop 30564 1726882922.91374: done getting the remaining hosts for this loop 30564 1726882922.91378: getting the next task for host managed_node2 30564 1726882922.91388: done getting next task for host managed_node2 30564 1726882922.91390: ^ task is: TASK: meta (role_complete) 30564 1726882922.91396: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882922.91410: getting variables 30564 1726882922.91411: in VariableManager get_vars() 30564 1726882922.91454: Calling all_inventory to load vars for managed_node2 30564 1726882922.91457: Calling groups_inventory to load vars for managed_node2 30564 1726882922.91459: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882922.91472: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.91475: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.91479: Calling groups_plugins_play to load vars for managed_node2 30564 1726882922.92479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882922.93601: done with get_vars() 30564 1726882922.93634: done getting variables 30564 1726882922.93721: done queuing things up, now waiting for results queue to drain 30564 1726882922.93729: results queue empty 30564 1726882922.93730: checking for any_errors_fatal 30564 1726882922.93733: done checking for 
any_errors_fatal 30564 1726882922.93734: checking for max_fail_percentage 30564 1726882922.93735: done checking for max_fail_percentage 30564 1726882922.93736: checking to see if all hosts have failed and the running result is not ok 30564 1726882922.93736: done checking to see if all hosts have failed 30564 1726882922.93737: getting the remaining hosts for this loop 30564 1726882922.93738: done getting the remaining hosts for this loop 30564 1726882922.93740: getting the next task for host managed_node2 30564 1726882922.93746: done getting next task for host managed_node2 30564 1726882922.93748: ^ task is: TASK: Test 30564 1726882922.93750: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882922.93753: getting variables 30564 1726882922.93754: in VariableManager get_vars() 30564 1726882922.93769: Calling all_inventory to load vars for managed_node2 30564 1726882922.93771: Calling groups_inventory to load vars for managed_node2 30564 1726882922.93774: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882922.93778: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.93781: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.93783: Calling groups_plugins_play to load vars for managed_node2 30564 1726882922.95260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882922.97160: done with get_vars() 30564 1726882922.97184: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:42:02 -0400 (0:00:00.367) 0:02:01.554 ****** 30564 1726882922.97269: entering _queue_task() for managed_node2/include_tasks 30564 1726882922.97606: worker is 1 (out of 1 available) 30564 1726882922.97617: exiting _queue_task() for managed_node2/include_tasks 30564 1726882922.97634: done queuing things up, now waiting for results queue to drain 30564 1726882922.97636: waiting for pending results... 
30564 1726882922.97948: running TaskExecutor() for managed_node2/TASK: Test 30564 1726882922.98054: in run() - task 0e448fcc-3ce9-4216-acec-0000000020b1 30564 1726882922.98072: variable 'ansible_search_path' from source: unknown 30564 1726882922.98075: variable 'ansible_search_path' from source: unknown 30564 1726882922.98123: variable 'lsr_test' from source: include params 30564 1726882922.98359: variable 'lsr_test' from source: include params 30564 1726882922.98435: variable 'omit' from source: magic vars 30564 1726882922.98582: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882922.98591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882922.98602: variable 'omit' from source: magic vars 30564 1726882922.98880: variable 'ansible_distribution_major_version' from source: facts 30564 1726882922.98889: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882922.98896: variable 'item' from source: unknown 30564 1726882922.98974: variable 'item' from source: unknown 30564 1726882922.99003: variable 'item' from source: unknown 30564 1726882922.99074: variable 'item' from source: unknown 30564 1726882922.99208: dumping result to json 30564 1726882922.99212: done dumping result, returning 30564 1726882922.99215: done running TaskExecutor() for managed_node2/TASK: Test [0e448fcc-3ce9-4216-acec-0000000020b1] 30564 1726882922.99217: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b1 30564 1726882922.99258: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b1 30564 1726882922.99262: WORKER PROCESS EXITING 30564 1726882922.99349: no more pending results, returning what we have 30564 1726882922.99354: in VariableManager get_vars() 30564 1726882922.99400: Calling all_inventory to load vars for managed_node2 30564 1726882922.99404: Calling groups_inventory to load vars for managed_node2 30564 1726882922.99407: Calling all_plugins_inventory to load 
vars for managed_node2 30564 1726882922.99420: Calling all_plugins_play to load vars for managed_node2 30564 1726882922.99423: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882922.99426: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.01007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.02909: done with get_vars() 30564 1726882923.02929: variable 'ansible_search_path' from source: unknown 30564 1726882923.02930: variable 'ansible_search_path' from source: unknown 30564 1726882923.02980: we have included files to process 30564 1726882923.02981: generating all_blocks data 30564 1726882923.02984: done generating all_blocks data 30564 1726882923.02991: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882923.02992: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882923.02994: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml 30564 1726882923.03126: done processing included file 30564 1726882923.03128: iterating over new_blocks loaded from include file 30564 1726882923.03130: in VariableManager get_vars() 30564 1726882923.03146: done with get_vars() 30564 1726882923.03147: filtering new block on tags 30564 1726882923.03186: done filtering new block on tags 30564 1726882923.03189: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml for managed_node2 => (item=tasks/remove+down_profile.yml) 30564 1726882923.03193: extending task lists for all hosts with included blocks 30564 1726882923.04315: done extending task 
lists 30564 1726882923.04316: done processing included files 30564 1726882923.04317: results queue empty 30564 1726882923.04318: checking for any_errors_fatal 30564 1726882923.04320: done checking for any_errors_fatal 30564 1726882923.04321: checking for max_fail_percentage 30564 1726882923.04322: done checking for max_fail_percentage 30564 1726882923.04322: checking to see if all hosts have failed and the running result is not ok 30564 1726882923.04323: done checking to see if all hosts have failed 30564 1726882923.04324: getting the remaining hosts for this loop 30564 1726882923.04325: done getting the remaining hosts for this loop 30564 1726882923.04328: getting the next task for host managed_node2 30564 1726882923.04332: done getting next task for host managed_node2 30564 1726882923.04334: ^ task is: TASK: Include network role 30564 1726882923.04337: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882923.04340: getting variables 30564 1726882923.04340: in VariableManager get_vars() 30564 1726882923.04352: Calling all_inventory to load vars for managed_node2 30564 1726882923.04361: Calling groups_inventory to load vars for managed_node2 30564 1726882923.04363: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.04372: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.04375: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.04378: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.05827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.08516: done with get_vars() 30564 1726882923.08544: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove+down_profile.yml:3 Friday 20 September 2024 21:42:03 -0400 (0:00:00.113) 0:02:01.667 ****** 30564 1726882923.08642: entering _queue_task() for managed_node2/include_role 30564 1726882923.08991: worker is 1 (out of 1 available) 30564 1726882923.09002: exiting _queue_task() for managed_node2/include_role 30564 1726882923.09014: done queuing things up, now waiting for results queue to drain 30564 1726882923.09015: waiting for pending results... 
30564 1726882923.09313: running TaskExecutor() for managed_node2/TASK: Include network role 30564 1726882923.09425: in run() - task 0e448fcc-3ce9-4216-acec-000000002612 30564 1726882923.09439: variable 'ansible_search_path' from source: unknown 30564 1726882923.09442: variable 'ansible_search_path' from source: unknown 30564 1726882923.09483: calling self._execute() 30564 1726882923.09699: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.09703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882923.09705: variable 'omit' from source: magic vars 30564 1726882923.11028: variable 'ansible_distribution_major_version' from source: facts 30564 1726882923.11041: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882923.11049: _execute() done 30564 1726882923.11052: dumping result to json 30564 1726882923.11055: done dumping result, returning 30564 1726882923.11059: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-4216-acec-000000002612] 30564 1726882923.11066: sending task result for task 0e448fcc-3ce9-4216-acec-000000002612 30564 1726882923.11185: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002612 30564 1726882923.11189: WORKER PROCESS EXITING 30564 1726882923.11217: no more pending results, returning what we have 30564 1726882923.11223: in VariableManager get_vars() 30564 1726882923.11279: Calling all_inventory to load vars for managed_node2 30564 1726882923.11282: Calling groups_inventory to load vars for managed_node2 30564 1726882923.11286: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.11301: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.11305: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.11308: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.15026: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.17000: done with get_vars() 30564 1726882923.17026: variable 'ansible_search_path' from source: unknown 30564 1726882923.17027: variable 'ansible_search_path' from source: unknown 30564 1726882923.17186: variable 'omit' from source: magic vars 30564 1726882923.17229: variable 'omit' from source: magic vars 30564 1726882923.17246: variable 'omit' from source: magic vars 30564 1726882923.17249: we have included files to process 30564 1726882923.17250: generating all_blocks data 30564 1726882923.17252: done generating all_blocks data 30564 1726882923.17253: processing included file: fedora.linux_system_roles.network 30564 1726882923.17278: in VariableManager get_vars() 30564 1726882923.17294: done with get_vars() 30564 1726882923.17323: in VariableManager get_vars() 30564 1726882923.17346: done with get_vars() 30564 1726882923.17390: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 30564 1726882923.17499: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 30564 1726882923.17574: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 30564 1726882923.18057: in VariableManager get_vars() 30564 1726882923.18084: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882923.21500: iterating over new_blocks loaded from include file 30564 1726882923.21502: in VariableManager get_vars() 30564 1726882923.21520: done with get_vars() 30564 1726882923.21522: filtering new block on tags 30564 1726882923.21857: done filtering new block on tags 30564 1726882923.21860: in VariableManager get_vars() 30564 1726882923.21886: done with get_vars() 30564 1726882923.21888: filtering new block on tags 30564 1726882923.21905: done 
filtering new block on tags 30564 1726882923.21907: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 30564 1726882923.21914: extending task lists for all hosts with included blocks 30564 1726882923.22034: done extending task lists 30564 1726882923.22035: done processing included files 30564 1726882923.22036: results queue empty 30564 1726882923.22037: checking for any_errors_fatal 30564 1726882923.22041: done checking for any_errors_fatal 30564 1726882923.22042: checking for max_fail_percentage 30564 1726882923.22043: done checking for max_fail_percentage 30564 1726882923.22044: checking to see if all hosts have failed and the running result is not ok 30564 1726882923.22045: done checking to see if all hosts have failed 30564 1726882923.22046: getting the remaining hosts for this loop 30564 1726882923.22047: done getting the remaining hosts for this loop 30564 1726882923.22050: getting the next task for host managed_node2 30564 1726882923.22054: done getting next task for host managed_node2 30564 1726882923.22057: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882923.22060: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882923.22075: getting variables 30564 1726882923.22076: in VariableManager get_vars() 30564 1726882923.22095: Calling all_inventory to load vars for managed_node2 30564 1726882923.22097: Calling groups_inventory to load vars for managed_node2 30564 1726882923.22099: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.22104: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.22106: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.22108: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.23444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.25402: done with get_vars() 30564 1726882923.25423: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:42:03 -0400 (0:00:00.168) 0:02:01.836 ****** 30564 1726882923.25508: entering _queue_task() for managed_node2/include_tasks 30564 1726882923.25892: worker is 1 (out of 1 available) 30564 1726882923.25905: exiting _queue_task() for managed_node2/include_tasks 30564 1726882923.25918: done queuing things up, now waiting for results queue to drain 30564 1726882923.25919: waiting for pending results... 
30564 1726882923.26261: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 30564 1726882923.26426: in run() - task 0e448fcc-3ce9-4216-acec-000000002694 30564 1726882923.26452: variable 'ansible_search_path' from source: unknown 30564 1726882923.26461: variable 'ansible_search_path' from source: unknown 30564 1726882923.26512: calling self._execute() 30564 1726882923.26736: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.26748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882923.26822: variable 'omit' from source: magic vars 30564 1726882923.27652: variable 'ansible_distribution_major_version' from source: facts 30564 1726882923.27676: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882923.27689: _execute() done 30564 1726882923.27697: dumping result to json 30564 1726882923.27703: done dumping result, returning 30564 1726882923.27750: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-4216-acec-000000002694] 30564 1726882923.27760: sending task result for task 0e448fcc-3ce9-4216-acec-000000002694 30564 1726882923.27995: no more pending results, returning what we have 30564 1726882923.27999: in VariableManager get_vars() 30564 1726882923.28055: Calling all_inventory to load vars for managed_node2 30564 1726882923.28058: Calling groups_inventory to load vars for managed_node2 30564 1726882923.28061: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.28078: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.28082: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.28085: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.29264: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002694 30564 
1726882923.29274: WORKER PROCESS EXITING 30564 1726882923.30184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.32011: done with get_vars() 30564 1726882923.32037: variable 'ansible_search_path' from source: unknown 30564 1726882923.32039: variable 'ansible_search_path' from source: unknown 30564 1726882923.32084: we have included files to process 30564 1726882923.32085: generating all_blocks data 30564 1726882923.32087: done generating all_blocks data 30564 1726882923.32091: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882923.32092: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882923.32094: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 30564 1726882923.32729: done processing included file 30564 1726882923.32732: iterating over new_blocks loaded from include file 30564 1726882923.32733: in VariableManager get_vars() 30564 1726882923.32760: done with get_vars() 30564 1726882923.32762: filtering new block on tags 30564 1726882923.32803: done filtering new block on tags 30564 1726882923.32805: in VariableManager get_vars() 30564 1726882923.32831: done with get_vars() 30564 1726882923.32833: filtering new block on tags 30564 1726882923.32885: done filtering new block on tags 30564 1726882923.32888: in VariableManager get_vars() 30564 1726882923.32916: done with get_vars() 30564 1726882923.32918: filtering new block on tags 30564 1726882923.32960: done filtering new block on tags 30564 1726882923.32964: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 30564 1726882923.32972: extending task lists for all hosts 
with included blocks 30564 1726882923.35244: done extending task lists 30564 1726882923.35246: done processing included files 30564 1726882923.35247: results queue empty 30564 1726882923.35247: checking for any_errors_fatal 30564 1726882923.35250: done checking for any_errors_fatal 30564 1726882923.35251: checking for max_fail_percentage 30564 1726882923.35252: done checking for max_fail_percentage 30564 1726882923.35253: checking to see if all hosts have failed and the running result is not ok 30564 1726882923.35254: done checking to see if all hosts have failed 30564 1726882923.35255: getting the remaining hosts for this loop 30564 1726882923.35256: done getting the remaining hosts for this loop 30564 1726882923.35259: getting the next task for host managed_node2 30564 1726882923.35266: done getting next task for host managed_node2 30564 1726882923.35271: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882923.35275: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882923.35287: getting variables 30564 1726882923.35288: in VariableManager get_vars() 30564 1726882923.35303: Calling all_inventory to load vars for managed_node2 30564 1726882923.35305: Calling groups_inventory to load vars for managed_node2 30564 1726882923.35421: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.35428: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.35430: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.35434: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.37661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.39571: done with get_vars() 30564 1726882923.39592: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:42:03 -0400 (0:00:00.141) 0:02:01.978 ****** 30564 1726882923.39684: entering _queue_task() for managed_node2/setup 30564 1726882923.40046: worker is 1 (out of 1 available) 30564 1726882923.40058: exiting _queue_task() for managed_node2/setup 30564 1726882923.40083: done queuing things up, now waiting for results queue to drain 30564 1726882923.40085: waiting for pending results... 
30564 1726882923.40398: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 30564 1726882923.40583: in run() - task 0e448fcc-3ce9-4216-acec-0000000026eb 30564 1726882923.40602: variable 'ansible_search_path' from source: unknown 30564 1726882923.40609: variable 'ansible_search_path' from source: unknown 30564 1726882923.40656: calling self._execute() 30564 1726882923.40774: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.40786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882923.40802: variable 'omit' from source: magic vars 30564 1726882923.41212: variable 'ansible_distribution_major_version' from source: facts 30564 1726882923.41229: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882923.41481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882923.44203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882923.44285: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882923.44330: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882923.44379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882923.44411: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882923.44506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882923.44544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882923.44586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882923.44632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882923.44658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882923.44721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882923.44750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882923.44794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882923.44841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882923.44862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882923.45048: variable '__network_required_facts' from source: role 
'' defaults 30564 1726882923.45061: variable 'ansible_facts' from source: unknown 30564 1726882923.45916: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 30564 1726882923.45926: when evaluation is False, skipping this task 30564 1726882923.45934: _execute() done 30564 1726882923.45938: dumping result to json 30564 1726882923.45940: done dumping result, returning 30564 1726882923.45946: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-4216-acec-0000000026eb] 30564 1726882923.45952: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026eb 30564 1726882923.46040: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026eb 30564 1726882923.46043: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882923.46128: no more pending results, returning what we have 30564 1726882923.46132: results queue empty 30564 1726882923.46133: checking for any_errors_fatal 30564 1726882923.46135: done checking for any_errors_fatal 30564 1726882923.46135: checking for max_fail_percentage 30564 1726882923.46137: done checking for max_fail_percentage 30564 1726882923.46138: checking to see if all hosts have failed and the running result is not ok 30564 1726882923.46139: done checking to see if all hosts have failed 30564 1726882923.46140: getting the remaining hosts for this loop 30564 1726882923.46141: done getting the remaining hosts for this loop 30564 1726882923.46145: getting the next task for host managed_node2 30564 1726882923.46158: done getting next task for host managed_node2 30564 1726882923.46161: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882923.46169: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882923.46196: getting variables 30564 1726882923.46198: in VariableManager get_vars() 30564 1726882923.46238: Calling all_inventory to load vars for managed_node2 30564 1726882923.46240: Calling groups_inventory to load vars for managed_node2 30564 1726882923.46243: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.46251: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.46254: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.46261: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.47937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.49482: done with get_vars() 30564 1726882923.49497: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:42:03 -0400 (0:00:00.098) 0:02:02.077 ****** 30564 1726882923.49569: entering _queue_task() for managed_node2/stat 30564 1726882923.49784: worker is 1 (out of 1 available) 30564 1726882923.49795: exiting _queue_task() for managed_node2/stat 30564 1726882923.49807: done queuing things up, now waiting for results queue to drain 30564 1726882923.49808: waiting for pending results... 
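The task skipped above was gated on the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, which the log shows evaluating to `False`. A minimal Python sketch of what that Jinja2 expression computes (the variable contents here are hypothetical examples, not values from this run; `ansible.builtin.difference` is roughly a set difference keeping items of the first list absent from the second):

```python
# Hypothetical stand-ins for the role's required facts and the gathered facts.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# difference(ansible_facts.keys() | list): facts the role needs but doesn't have yet.
missing = [f for f in required_facts if f not in ansible_facts.keys()]

# The gathering task runs only when at least one required fact is missing.
# Here the difference is empty, so the conditional is False and the task skips,
# matching the "Evaluated conditional ...: False" line in the log above.
run_task = len(missing) > 0
print(run_task)  # False in this sketch
```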
30564 1726882923.50002: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 30564 1726882923.50106: in run() - task 0e448fcc-3ce9-4216-acec-0000000026ed 30564 1726882923.50117: variable 'ansible_search_path' from source: unknown 30564 1726882923.50120: variable 'ansible_search_path' from source: unknown 30564 1726882923.50148: calling self._execute() 30564 1726882923.50227: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.50231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882923.50240: variable 'omit' from source: magic vars 30564 1726882923.50518: variable 'ansible_distribution_major_version' from source: facts 30564 1726882923.50531: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882923.50647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882923.50841: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882923.50876: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882923.50901: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882923.50925: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882923.51003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882923.51020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882923.51038: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882923.51056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882923.51141: variable '__network_is_ostree' from source: set_fact 30564 1726882923.51147: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882923.51149: when evaluation is False, skipping this task 30564 1726882923.51232: _execute() done 30564 1726882923.51238: dumping result to json 30564 1726882923.51241: done dumping result, returning 30564 1726882923.51243: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-4216-acec-0000000026ed] 30564 1726882923.51246: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026ed skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882923.51428: no more pending results, returning what we have 30564 1726882923.51433: results queue empty 30564 1726882923.51435: checking for any_errors_fatal 30564 1726882923.51443: done checking for any_errors_fatal 30564 1726882923.51443: checking for max_fail_percentage 30564 1726882923.51446: done checking for max_fail_percentage 30564 1726882923.51447: checking to see if all hosts have failed and the running result is not ok 30564 1726882923.51447: done checking to see if all hosts have failed 30564 1726882923.51448: getting the remaining hosts for this loop 30564 1726882923.51450: done getting the remaining hosts for this loop 30564 1726882923.51453: getting the next task for host managed_node2 30564 1726882923.51582: done getting next task for host managed_node2 30564 
1726882923.51587: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882923.51593: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882923.51604: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026ed 30564 1726882923.51606: WORKER PROCESS EXITING 30564 1726882923.51636: getting variables 30564 1726882923.51638: in VariableManager get_vars() 30564 1726882923.51814: Calling all_inventory to load vars for managed_node2 30564 1726882923.51817: Calling groups_inventory to load vars for managed_node2 30564 1726882923.51820: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.51829: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.51832: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.51835: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.53442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.56125: done with get_vars() 30564 1726882923.56161: done getting variables 30564 1726882923.56217: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:42:03 -0400 (0:00:00.066) 0:02:02.143 ****** 30564 1726882923.56267: entering _queue_task() for managed_node2/set_fact 30564 1726882923.56584: worker is 1 (out of 1 available) 30564 1726882923.56601: exiting _queue_task() for managed_node2/set_fact 30564 1726882923.56614: done queuing things up, now waiting for results queue to drain 30564 1726882923.56615: waiting for pending results... 
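Both the "Check if system is ostree" stat task and the "Set flag" set_fact task above skip on `not __network_is_ostree is defined`: once the fact has been set by an earlier run of the role, the check is not repeated. A small Python sketch of that guard, assuming a hypothetical facts dictionary in which the fact was already cached:

```python
# Hypothetical: __network_is_ostree was set by a previous invocation of the role.
facts = {"__network_is_ostree": False}

# Jinja2: not __network_is_ostree is defined
# The "is defined" test only asks whether the variable exists, not whether
# its value is truthy, so an already-set False still causes the skip.
run_task = "__network_is_ostree" not in facts
print(run_task)  # False -> task is skipped, as in the log above
```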
30564 1726882923.56934: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 30564 1726882923.57143: in run() - task 0e448fcc-3ce9-4216-acec-0000000026ee 30564 1726882923.57167: variable 'ansible_search_path' from source: unknown 30564 1726882923.57179: variable 'ansible_search_path' from source: unknown 30564 1726882923.57248: calling self._execute() 30564 1726882923.57383: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.57395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882923.57409: variable 'omit' from source: magic vars 30564 1726882923.57849: variable 'ansible_distribution_major_version' from source: facts 30564 1726882923.57870: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882923.58068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882923.58400: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882923.58465: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882923.58507: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882923.58544: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882923.58646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882923.58695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882923.58728: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882923.58761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882923.58862: variable '__network_is_ostree' from source: set_fact 30564 1726882923.58879: Evaluated conditional (not __network_is_ostree is defined): False 30564 1726882923.58897: when evaluation is False, skipping this task 30564 1726882923.58906: _execute() done 30564 1726882923.58913: dumping result to json 30564 1726882923.58919: done dumping result, returning 30564 1726882923.58929: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-4216-acec-0000000026ee] 30564 1726882923.58938: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026ee skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 30564 1726882923.59088: no more pending results, returning what we have 30564 1726882923.59093: results queue empty 30564 1726882923.59094: checking for any_errors_fatal 30564 1726882923.59101: done checking for any_errors_fatal 30564 1726882923.59102: checking for max_fail_percentage 30564 1726882923.59104: done checking for max_fail_percentage 30564 1726882923.59105: checking to see if all hosts have failed and the running result is not ok 30564 1726882923.59106: done checking to see if all hosts have failed 30564 1726882923.59107: getting the remaining hosts for this loop 30564 1726882923.59109: done getting the remaining hosts for this loop 30564 1726882923.59113: getting the next task for host managed_node2 30564 1726882923.59126: done getting next task for host managed_node2 30564 
1726882923.59130: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882923.59137: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882923.59175: getting variables 30564 1726882923.59177: in VariableManager get_vars() 30564 1726882923.59227: Calling all_inventory to load vars for managed_node2 30564 1726882923.59229: Calling groups_inventory to load vars for managed_node2 30564 1726882923.59232: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882923.59243: Calling all_plugins_play to load vars for managed_node2 30564 1726882923.59246: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882923.59249: Calling groups_plugins_play to load vars for managed_node2 30564 1726882923.60217: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026ee 30564 1726882923.60221: WORKER PROCESS EXITING 30564 1726882923.61287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882923.63189: done with get_vars() 30564 1726882923.63211: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:42:03 -0400 (0:00:00.070) 0:02:02.214 ****** 30564 1726882923.63313: entering _queue_task() for managed_node2/service_facts 30564 1726882923.63608: worker is 1 (out of 1 available) 30564 1726882923.63624: exiting _queue_task() for managed_node2/service_facts 30564 1726882923.63638: done queuing things up, now waiting for results queue to drain 30564 1726882923.63640: waiting for pending results... 
30564 1726882923.63948: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 30564 1726882923.64114: in run() - task 0e448fcc-3ce9-4216-acec-0000000026f0 30564 1726882923.64135: variable 'ansible_search_path' from source: unknown 30564 1726882923.64143: variable 'ansible_search_path' from source: unknown 30564 1726882923.64195: calling self._execute() 30564 1726882923.64308: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.64319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882923.64336: variable 'omit' from source: magic vars 30564 1726882923.64774: variable 'ansible_distribution_major_version' from source: facts 30564 1726882923.64792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882923.64809: variable 'omit' from source: magic vars 30564 1726882923.64905: variable 'omit' from source: magic vars 30564 1726882923.64956: variable 'omit' from source: magic vars 30564 1726882923.65001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882923.65054: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882923.65080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882923.65103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882923.65119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882923.65168: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882923.65178: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.65187: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882923.65309: Set connection var ansible_timeout to 10 30564 1726882923.65320: Set connection var ansible_pipelining to False 30564 1726882923.65328: Set connection var ansible_shell_type to sh 30564 1726882923.65339: Set connection var ansible_shell_executable to /bin/sh 30564 1726882923.65358: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882923.65376: Set connection var ansible_connection to ssh 30564 1726882923.65409: variable 'ansible_shell_executable' from source: unknown 30564 1726882923.65418: variable 'ansible_connection' from source: unknown 30564 1726882923.65426: variable 'ansible_module_compression' from source: unknown 30564 1726882923.65433: variable 'ansible_shell_type' from source: unknown 30564 1726882923.65440: variable 'ansible_shell_executable' from source: unknown 30564 1726882923.65447: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882923.65461: variable 'ansible_pipelining' from source: unknown 30564 1726882923.65472: variable 'ansible_timeout' from source: unknown 30564 1726882923.65492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882923.65711: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882923.65728: variable 'omit' from source: magic vars 30564 1726882923.65737: starting attempt loop 30564 1726882923.65744: running the handler 30564 1726882923.65762: _low_level_execute_command(): starting 30564 1726882923.65777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882923.66595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882923.66611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 30564 1726882923.66626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.66644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.66703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882923.66716: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882923.66730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.66750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882923.66765: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882923.66787: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882923.66804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.66819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.66836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.66850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882923.66865: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882923.66885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.66973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882923.67010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882923.67031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882923.67170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
30564 1726882923.68838: stdout chunk (state=3): >>>/root <<< 30564 1726882923.68945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882923.69030: stderr chunk (state=3): >>><<< 30564 1726882923.69042: stdout chunk (state=3): >>><<< 30564 1726882923.69071: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882923.69170: _low_level_execute_command(): starting 30564 1726882923.69174: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469 `" && echo ansible-tmp-1726882923.6908023-35922-30402421775469="` echo /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469 `" ) && sleep 0' 30564 1726882923.69759: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882923.69787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.69803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.69832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.69874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882923.69888: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882923.69902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.69920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882923.69941: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882923.69952: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882923.69966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.69980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.69996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.70009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882923.70020: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882923.70040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.70120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882923.70139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882923.70166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 30564 1726882923.70299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882923.72167: stdout chunk (state=3): >>>ansible-tmp-1726882923.6908023-35922-30402421775469=/root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469 <<< 30564 1726882923.72279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882923.72357: stderr chunk (state=3): >>><<< 30564 1726882923.72370: stdout chunk (state=3): >>><<< 30564 1726882923.72476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882923.6908023-35922-30402421775469=/root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882923.72479: variable 'ansible_module_compression' from source: unknown 30564 1726882923.72676: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 30564 1726882923.72680: variable 'ansible_facts' from source: unknown 30564 1726882923.72682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469/AnsiballZ_service_facts.py 30564 1726882923.72744: Sending initial data 30564 1726882923.72747: Sent initial data (161 bytes) 30564 1726882923.73743: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882923.73772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.73786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.73801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.73850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882923.73870: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882923.73896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.73914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882923.73927: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882923.73938: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882923.73951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.73968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.74006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.74020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882923.74033: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882923.74047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.74136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882923.74156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882923.74173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882923.74301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882923.76025: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882923.76116: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882923.76212: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp55n2uqtq /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469/AnsiballZ_service_facts.py <<< 30564 1726882923.76305: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882923.77458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882923.77634: stderr chunk (state=3): >>><<< 30564 1726882923.77637: stdout chunk (state=3): >>><<< 30564 
1726882923.77639: done transferring module to remote 30564 1726882923.77644: _low_level_execute_command(): starting 30564 1726882923.77646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469/ /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469/AnsiballZ_service_facts.py && sleep 0' 30564 1726882923.78245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882923.78257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.78279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.78296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.78349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882923.78361: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882923.78380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.78398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882923.78410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882923.78431: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882923.78443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.78459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.78478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.78491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882923.78503: stderr 
chunk (state=3): >>>debug2: match found <<< 30564 1726882923.78516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.78608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882923.78624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882923.78641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882923.78755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882923.80487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882923.80532: stderr chunk (state=3): >>><<< 30564 1726882923.80536: stdout chunk (state=3): >>><<< 30564 1726882923.80551: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 30564 1726882923.80554: _low_level_execute_command(): starting 30564 1726882923.80559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469/AnsiballZ_service_facts.py && sleep 0' 30564 1726882923.80959: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882923.80962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882923.80999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.81002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882923.81005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882923.81050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882923.81057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882923.81180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.13390: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 30564 1726882925.13410: stdout chunk (state=3): >>>"source": "systemd"}, 
"dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": 
{"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 30564 1726882925.13425: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": 
{"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 30564 1726882925.13433: stdout chunk (state=3): >>>e": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 30564 1726882925.13447: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "ina<<< 30564 1726882925.13453: stdout chunk (state=3): >>>ctive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "system<<< 30564 1726882925.13480: stdout chunk (state=3): >>>d"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": 
{"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 30564 1726882925.14685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882925.14739: stderr chunk (state=3): >>><<< 30564 1726882925.14744: stdout chunk (state=3): >>><<< 30564 1726882925.14760: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": 
"rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": 
"systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": 
"teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882925.15143: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882925.15150: _low_level_execute_command(): starting 30564 1726882925.15155: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882923.6908023-35922-30402421775469/ > /dev/null 2>&1 && sleep 0' 30564 1726882925.15583: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882925.15587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.15621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882925.15635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882925.15645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.15689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882925.15701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882925.15805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.17610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882925.17653: stderr chunk (state=3): >>><<< 30564 1726882925.17657: stdout chunk (state=3): >>><<< 30564 1726882925.17672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882925.17679: handler run 
complete 30564 1726882925.17780: variable 'ansible_facts' from source: unknown 30564 1726882925.17878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882925.18160: variable 'ansible_facts' from source: unknown 30564 1726882925.18285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882925.18467: attempt loop complete, returning result 30564 1726882925.18580: _execute() done 30564 1726882925.18583: dumping result to json 30564 1726882925.18636: done dumping result, returning 30564 1726882925.18646: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-4216-acec-0000000026f0] 30564 1726882925.18652: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026f0 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882925.19673: no more pending results, returning what we have 30564 1726882925.19676: results queue empty 30564 1726882925.19677: checking for any_errors_fatal 30564 1726882925.19682: done checking for any_errors_fatal 30564 1726882925.19684: checking for max_fail_percentage 30564 1726882925.19686: done checking for max_fail_percentage 30564 1726882925.19687: checking to see if all hosts have failed and the running result is not ok 30564 1726882925.19687: done checking to see if all hosts have failed 30564 1726882925.19688: getting the remaining hosts for this loop 30564 1726882925.19689: done getting the remaining hosts for this loop 30564 1726882925.19692: getting the next task for host managed_node2 30564 1726882925.19699: done getting next task for host managed_node2 30564 1726882925.19702: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882925.19708: ^ state is: HOST STATE: 
block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882925.19722: getting variables 30564 1726882925.19724: in VariableManager get_vars() 30564 1726882925.19761: Calling all_inventory to load vars for managed_node2 30564 1726882925.19766: Calling groups_inventory to load vars for managed_node2 30564 1726882925.19768: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882925.19779: Calling all_plugins_play to load vars for managed_node2 30564 1726882925.19782: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882925.19785: Calling groups_plugins_play to load vars for managed_node2 30564 1726882925.20481: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026f0 30564 1726882925.20485: WORKER PROCESS EXITING 30564 1726882925.25295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882925.26226: done with get_vars() 30564 1726882925.26243: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:42:05 -0400 (0:00:01.629) 0:02:03.844 ****** 30564 1726882925.26306: entering _queue_task() for managed_node2/package_facts 30564 1726882925.26555: worker is 1 (out of 1 available) 30564 1726882925.26570: exiting _queue_task() for managed_node2/package_facts 30564 1726882925.26583: done queuing things up, now waiting for results queue to drain 30564 1726882925.26586: waiting for pending results... 
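[Editorial note] The log below walks through Ansible's per-task remote bootstrap: `_low_level_execute_command()` creates a private temp directory with `umask 77 && mkdir -p`, transfers the AnsiballZ module payload over sftp, marks it executable with `chmod u+x`, and runs it with the remote Python interpreter. A minimal local sketch of that sequence, with illustrative paths and a stand-in payload (not the actual AnsiballZ wrapper or the paths from this run):

```shell
set -eu

# 1. Private temp directory, mirroring the log's `umask 77 && mkdir -p` step
#    (umask 77 yields mode 0700, so only the connecting user can read it).
tmp_root="$(mktemp -d)"
task_dir="${tmp_root}/ansible-tmp-demo"
( umask 77 && mkdir -p "$task_dir" )

# 2. "Transfer" the module payload. The log uses `sftp put`; a plain cp
#    stands in here, and this one-liner stands in for AnsiballZ_package_facts.py.
printf '%s\n' 'print("ok")' > "${tmp_root}/payload.py"
cp "${tmp_root}/payload.py" "${task_dir}/AnsiballZ_demo.py"

# 3. Make the directory and payload executable, as in the `chmod u+x` step.
chmod u+x "$task_dir" "${task_dir}/AnsiballZ_demo.py"

# 4. Execute with the interpreter, as in `/usr/bin/python3.9 ... && sleep 0`.
python3 "${task_dir}/AnsiballZ_demo.py"
```

In the real run each of these shell commands is wrapped in `/bin/sh -c '... && sleep 0'` and sent over the multiplexed SSH connection (the `mux_client_request_session` lines), which is why every step produces its own block of `debug1`/`debug2` stderr chunks.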
30564 1726882925.26777: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 30564 1726882925.26898: in run() - task 0e448fcc-3ce9-4216-acec-0000000026f1 30564 1726882925.26912: variable 'ansible_search_path' from source: unknown 30564 1726882925.26917: variable 'ansible_search_path' from source: unknown 30564 1726882925.26945: calling self._execute() 30564 1726882925.27032: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882925.27037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882925.27045: variable 'omit' from source: magic vars 30564 1726882925.27347: variable 'ansible_distribution_major_version' from source: facts 30564 1726882925.27359: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882925.27366: variable 'omit' from source: magic vars 30564 1726882925.27421: variable 'omit' from source: magic vars 30564 1726882925.27443: variable 'omit' from source: magic vars 30564 1726882925.27483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882925.27509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882925.27524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882925.27537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882925.27546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882925.27574: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882925.27578: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882925.27581: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882925.27647: Set connection var ansible_timeout to 10 30564 1726882925.27651: Set connection var ansible_pipelining to False 30564 1726882925.27653: Set connection var ansible_shell_type to sh 30564 1726882925.27658: Set connection var ansible_shell_executable to /bin/sh 30564 1726882925.27668: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882925.27672: Set connection var ansible_connection to ssh 30564 1726882925.27693: variable 'ansible_shell_executable' from source: unknown 30564 1726882925.27696: variable 'ansible_connection' from source: unknown 30564 1726882925.27699: variable 'ansible_module_compression' from source: unknown 30564 1726882925.27702: variable 'ansible_shell_type' from source: unknown 30564 1726882925.27705: variable 'ansible_shell_executable' from source: unknown 30564 1726882925.27707: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882925.27709: variable 'ansible_pipelining' from source: unknown 30564 1726882925.27712: variable 'ansible_timeout' from source: unknown 30564 1726882925.27714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882925.27857: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882925.27867: variable 'omit' from source: magic vars 30564 1726882925.27874: starting attempt loop 30564 1726882925.27877: running the handler 30564 1726882925.27890: _low_level_execute_command(): starting 30564 1726882925.27899: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882925.28425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 30564 1726882925.28438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.28466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882925.28476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882925.28490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882925.28496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.28556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882925.28570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882925.28682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.30326: stdout chunk (state=3): >>>/root <<< 30564 1726882925.30430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882925.30489: stderr chunk (state=3): >>><<< 30564 1726882925.30492: stdout chunk (state=3): >>><<< 30564 1726882925.30514: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882925.30526: _low_level_execute_command(): starting 30564 1726882925.30529: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712 `" && echo ansible-tmp-1726882925.3051202-35994-32832756664712="` echo /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712 `" ) && sleep 0' 30564 1726882925.30977: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882925.30981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.31017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882925.31030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882925.31033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882925.31035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.31075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882925.31083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882925.31194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.33075: stdout chunk (state=3): >>>ansible-tmp-1726882925.3051202-35994-32832756664712=/root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712 <<< 30564 1726882925.33181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882925.33238: stderr chunk (state=3): >>><<< 30564 1726882925.33244: stdout chunk (state=3): >>><<< 30564 1726882925.33261: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882925.3051202-35994-32832756664712=/root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882925.33308: variable 'ansible_module_compression' from source: unknown 30564 1726882925.33347: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 30564 1726882925.33402: variable 'ansible_facts' from source: unknown 30564 1726882925.33536: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712/AnsiballZ_package_facts.py 30564 1726882925.33654: Sending initial data 30564 1726882925.33659: Sent initial data (161 bytes) 30564 1726882925.34344: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882925.34348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.34391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.34405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.34453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882925.34478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882925.34583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.36309: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 30564 1726882925.36318: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882925.36413: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882925.36513: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp1cyegl3a /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712/AnsiballZ_package_facts.py <<< 30564 1726882925.36603: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882925.38595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 
1726882925.38707: stderr chunk (state=3): >>><<< 30564 1726882925.38711: stdout chunk (state=3): >>><<< 30564 1726882925.38727: done transferring module to remote 30564 1726882925.38736: _low_level_execute_command(): starting 30564 1726882925.38741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712/ /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712/AnsiballZ_package_facts.py && sleep 0' 30564 1726882925.39203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882925.39210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.39245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882925.39251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882925.39261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882925.39275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.39320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882925.39332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882925.39449: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.41216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882925.41262: stderr chunk (state=3): >>><<< 30564 1726882925.41268: stdout chunk (state=3): >>><<< 30564 1726882925.41286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882925.41289: _low_level_execute_command(): starting 30564 1726882925.41298: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712/AnsiballZ_package_facts.py && sleep 0' 30564 1726882925.41725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882925.41736: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.41765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.41779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.41789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.41837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882925.41851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882925.41966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.87889: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": 
[{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": 
"readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 30564 1726882925.87931: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": 
"2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": 
"45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": 
[{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": 
[{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 30564 1726882925.87985: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", 
"release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 30564 1726882925.88017: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", 
"version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": 
[{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": 
"perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 30564 1726882925.88054: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": 
"perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", 
"version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 30564 1726882925.88070: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 30564 1726882925.88097: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 30564 1726882925.89849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882925.89852: stderr chunk (state=3): >>><<< 30564 1726882925.89855: stdout chunk (state=3): >>><<< 30564 1726882925.89903: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882925.93842: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882925.93862: _low_level_execute_command(): starting 30564 1726882925.93865: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882925.3051202-35994-32832756664712/ > /dev/null 2>&1 && sleep 0' 30564 1726882925.95175: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882925.95283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882925.95293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882925.95307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.95669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882925.95675: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882925.95677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.95680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882925.95682: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 30564 1726882925.95684: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882925.95686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882925.95688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882925.95788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882925.95791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882925.95794: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882925.95795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882925.95800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882925.96170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882925.96173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882925.96175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882925.97844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882925.97847: stdout chunk (state=3): >>><<< 30564 1726882925.97854: stderr chunk (state=3): >>><<< 30564 1726882925.97869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882925.97879: handler run complete 30564 1726882925.99716: variable 'ansible_facts' from source: unknown 30564 1726882926.00217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.04606: variable 'ansible_facts' from source: unknown 30564 1726882926.05803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.07316: attempt loop complete, returning result 30564 1726882926.07326: _execute() done 30564 1726882926.07329: dumping result to json 30564 1726882926.07562: done dumping result, returning 30564 1726882926.07577: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-4216-acec-0000000026f1] 30564 1726882926.07582: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026f1 30564 1726882926.11525: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026f1 30564 1726882926.11529: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882926.11700: no more pending results, returning what we have 30564 1726882926.11703: results queue empty 30564 1726882926.11705: checking for 
any_errors_fatal 30564 1726882926.11711: done checking for any_errors_fatal 30564 1726882926.11712: checking for max_fail_percentage 30564 1726882926.11714: done checking for max_fail_percentage 30564 1726882926.11715: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.11716: done checking to see if all hosts have failed 30564 1726882926.11716: getting the remaining hosts for this loop 30564 1726882926.11718: done getting the remaining hosts for this loop 30564 1726882926.11721: getting the next task for host managed_node2 30564 1726882926.11729: done getting next task for host managed_node2 30564 1726882926.11733: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882926.11740: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882926.11753: getting variables 30564 1726882926.11755: in VariableManager get_vars() 30564 1726882926.11792: Calling all_inventory to load vars for managed_node2 30564 1726882926.11796: Calling groups_inventory to load vars for managed_node2 30564 1726882926.11803: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.11812: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.11815: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.11819: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.14222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.17893: done with get_vars() 30564 1726882926.17924: done getting variables 30564 1726882926.17988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:42:06 -0400 (0:00:00.917) 0:02:04.761 ****** 30564 1726882926.18025: entering _queue_task() for managed_node2/debug 30564 1726882926.18359: worker is 1 (out of 1 available) 30564 1726882926.18685: exiting _queue_task() for managed_node2/debug 30564 1726882926.18697: done queuing things up, now waiting for results queue to drain 30564 1726882926.18699: waiting for pending results... 
30564 1726882926.20159: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 30564 1726882926.20291: in run() - task 0e448fcc-3ce9-4216-acec-000000002695 30564 1726882926.20304: variable 'ansible_search_path' from source: unknown 30564 1726882926.20308: variable 'ansible_search_path' from source: unknown 30564 1726882926.20343: calling self._execute() 30564 1726882926.21032: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.21038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.21048: variable 'omit' from source: magic vars 30564 1726882926.21956: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.21974: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.21980: variable 'omit' from source: magic vars 30564 1726882926.22047: variable 'omit' from source: magic vars 30564 1726882926.22143: variable 'network_provider' from source: set_fact 30564 1726882926.22161: variable 'omit' from source: magic vars 30564 1726882926.22208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882926.22239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882926.22257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882926.22682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882926.22695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882926.22724: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882926.22728: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 
1726882926.22730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.22834: Set connection var ansible_timeout to 10 30564 1726882926.22840: Set connection var ansible_pipelining to False 30564 1726882926.22843: Set connection var ansible_shell_type to sh 30564 1726882926.22849: Set connection var ansible_shell_executable to /bin/sh 30564 1726882926.22856: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882926.22859: Set connection var ansible_connection to ssh 30564 1726882926.22889: variable 'ansible_shell_executable' from source: unknown 30564 1726882926.22894: variable 'ansible_connection' from source: unknown 30564 1726882926.22897: variable 'ansible_module_compression' from source: unknown 30564 1726882926.22899: variable 'ansible_shell_type' from source: unknown 30564 1726882926.22902: variable 'ansible_shell_executable' from source: unknown 30564 1726882926.22904: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.22906: variable 'ansible_pipelining' from source: unknown 30564 1726882926.22908: variable 'ansible_timeout' from source: unknown 30564 1726882926.22910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.23045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882926.23056: variable 'omit' from source: magic vars 30564 1726882926.23061: starting attempt loop 30564 1726882926.23066: running the handler 30564 1726882926.23117: handler run complete 30564 1726882926.23133: attempt loop complete, returning result 30564 1726882926.23136: _execute() done 30564 1726882926.23139: dumping result to json 30564 1726882926.23141: done dumping result, returning 
30564 1726882926.23148: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-4216-acec-000000002695] 30564 1726882926.23154: sending task result for task 0e448fcc-3ce9-4216-acec-000000002695 30564 1726882926.23249: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002695 30564 1726882926.23252: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 30564 1726882926.23327: no more pending results, returning what we have 30564 1726882926.23331: results queue empty 30564 1726882926.23332: checking for any_errors_fatal 30564 1726882926.23344: done checking for any_errors_fatal 30564 1726882926.23344: checking for max_fail_percentage 30564 1726882926.23346: done checking for max_fail_percentage 30564 1726882926.23347: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.23348: done checking to see if all hosts have failed 30564 1726882926.23349: getting the remaining hosts for this loop 30564 1726882926.23350: done getting the remaining hosts for this loop 30564 1726882926.23354: getting the next task for host managed_node2 30564 1726882926.23361: done getting next task for host managed_node2 30564 1726882926.23367: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882926.23372: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882926.23387: getting variables 30564 1726882926.23388: in VariableManager get_vars() 30564 1726882926.23434: Calling all_inventory to load vars for managed_node2 30564 1726882926.23436: Calling groups_inventory to load vars for managed_node2 30564 1726882926.23439: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.23449: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.23452: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.23455: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.26958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.30131: done with get_vars() 30564 1726882926.30167: done getting variables 30564 1726882926.30346: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:42:06 -0400 (0:00:00.123) 0:02:04.885 ****** 30564 1726882926.30392: entering _queue_task() for managed_node2/fail 30564 1726882926.31073: worker is 1 (out of 1 available) 30564 1726882926.31179: exiting _queue_task() for managed_node2/fail 30564 1726882926.31220: done queuing things up, now waiting for results queue to drain 30564 1726882926.31222: waiting for pending results... 30564 1726882926.31857: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 30564 1726882926.32005: in run() - task 0e448fcc-3ce9-4216-acec-000000002696 30564 1726882926.32018: variable 'ansible_search_path' from source: unknown 30564 1726882926.32022: variable 'ansible_search_path' from source: unknown 30564 1726882926.32059: calling self._execute() 30564 1726882926.32165: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.32176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.32186: variable 'omit' from source: magic vars 30564 1726882926.33234: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.33251: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.33384: variable 'network_state' from source: role '' defaults 30564 1726882926.33399: Evaluated conditional (network_state != {}): False 30564 1726882926.33405: when evaluation is False, skipping this task 30564 1726882926.33409: _execute() done 30564 1726882926.33411: dumping result to json 30564 1726882926.33414: done dumping result, returning 30564 1726882926.33422: done running TaskExecutor() for managed_node2/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-4216-acec-000000002696] 30564 1726882926.33428: sending task result for task 0e448fcc-3ce9-4216-acec-000000002696 30564 1726882926.33530: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002696 30564 1726882926.33533: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882926.33607: no more pending results, returning what we have 30564 1726882926.33612: results queue empty 30564 1726882926.33613: checking for any_errors_fatal 30564 1726882926.33621: done checking for any_errors_fatal 30564 1726882926.33622: checking for max_fail_percentage 30564 1726882926.33623: done checking for max_fail_percentage 30564 1726882926.33624: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.33625: done checking to see if all hosts have failed 30564 1726882926.33625: getting the remaining hosts for this loop 30564 1726882926.33627: done getting the remaining hosts for this loop 30564 1726882926.33631: getting the next task for host managed_node2 30564 1726882926.33639: done getting next task for host managed_node2 30564 1726882926.33645: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882926.33650: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882926.33678: getting variables 30564 1726882926.33680: in VariableManager get_vars() 30564 1726882926.33719: Calling all_inventory to load vars for managed_node2 30564 1726882926.33722: Calling groups_inventory to load vars for managed_node2 30564 1726882926.33724: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.33733: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.33736: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.33738: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.35427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.37310: done with get_vars() 30564 1726882926.37327: done getting variables 30564 1726882926.37373: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:42:06 -0400 (0:00:00.070) 0:02:04.955 ****** 30564 1726882926.37399: entering _queue_task() for managed_node2/fail 30564 1726882926.37625: worker is 1 (out of 1 available) 30564 1726882926.37636: exiting _queue_task() for managed_node2/fail 30564 1726882926.37648: done queuing things up, now waiting for results queue to drain 30564 1726882926.37649: waiting for pending results... 30564 1726882926.37843: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 30564 1726882926.37941: in run() - task 0e448fcc-3ce9-4216-acec-000000002697 30564 1726882926.37961: variable 'ansible_search_path' from source: unknown 30564 1726882926.37966: variable 'ansible_search_path' from source: unknown 30564 1726882926.37993: calling self._execute() 30564 1726882926.38287: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.38290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.38293: variable 'omit' from source: magic vars 30564 1726882926.38515: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.38526: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.38648: variable 'network_state' from source: role '' defaults 30564 1726882926.38659: Evaluated conditional (network_state != {}): False 30564 1726882926.38662: when evaluation is False, skipping this task 30564 1726882926.38667: _execute() done 30564 1726882926.38672: dumping result to json 30564 1726882926.38674: done dumping result, returning 30564 1726882926.38682: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-4216-acec-000000002697] 30564 1726882926.38688: sending task result for task 0e448fcc-3ce9-4216-acec-000000002697 30564 1726882926.38794: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002697 30564 1726882926.38798: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882926.38844: no more pending results, returning what we have 30564 1726882926.38848: results queue empty 30564 1726882926.38849: checking for any_errors_fatal 30564 1726882926.38857: done checking for any_errors_fatal 30564 1726882926.38858: checking for max_fail_percentage 30564 1726882926.38859: done checking for max_fail_percentage 30564 1726882926.38860: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.38861: done checking to see if all hosts have failed 30564 1726882926.38862: getting the remaining hosts for this loop 30564 1726882926.38869: done getting the remaining hosts for this loop 30564 1726882926.38872: getting the next task for host managed_node2 30564 1726882926.38884: done getting next task for host managed_node2 30564 1726882926.38958: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882926.38969: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882926.38998: getting variables 30564 1726882926.39000: in VariableManager get_vars() 30564 1726882926.39051: Calling all_inventory to load vars for managed_node2 30564 1726882926.39053: Calling groups_inventory to load vars for managed_node2 30564 1726882926.39056: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.39072: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.39075: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.39079: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.41033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.42012: done with get_vars() 30564 1726882926.42028: done getting variables 30564 1726882926.42073: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:42:06 -0400 (0:00:00.046) 0:02:05.002 ****** 30564 1726882926.42097: entering _queue_task() for managed_node2/fail 30564 1726882926.42299: worker is 1 (out of 1 available) 30564 1726882926.42312: exiting _queue_task() for managed_node2/fail 30564 1726882926.42324: done queuing things up, now waiting for results queue to drain 30564 1726882926.42326: waiting for pending results... 30564 1726882926.42603: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 30564 1726882926.42895: in run() - task 0e448fcc-3ce9-4216-acec-000000002698 30564 1726882926.42900: variable 'ansible_search_path' from source: unknown 30564 1726882926.42902: variable 'ansible_search_path' from source: unknown 30564 1726882926.42905: calling self._execute() 30564 1726882926.43522: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.43529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.43539: variable 'omit' from source: magic vars 30564 1726882926.44294: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.44307: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.44809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882926.47491: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882926.47558: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882926.47606: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882926.47642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882926.47670: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882926.47752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.48187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.48214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.48350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.48353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.48565: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.48582: Evaluated conditional (ansible_distribution_major_version | int > 9): False 30564 1726882926.48585: when evaluation is False, skipping this task 30564 1726882926.48595: _execute() done 30564 1726882926.48598: dumping result to json 30564 1726882926.48600: done dumping result, returning 30564 1726882926.48609: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming 
configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-4216-acec-000000002698] 30564 1726882926.48615: sending task result for task 0e448fcc-3ce9-4216-acec-000000002698 30564 1726882926.48718: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002698 30564 1726882926.48721: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 30564 1726882926.48765: no more pending results, returning what we have 30564 1726882926.48771: results queue empty 30564 1726882926.48772: checking for any_errors_fatal 30564 1726882926.48778: done checking for any_errors_fatal 30564 1726882926.48779: checking for max_fail_percentage 30564 1726882926.48781: done checking for max_fail_percentage 30564 1726882926.48782: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.48783: done checking to see if all hosts have failed 30564 1726882926.48784: getting the remaining hosts for this loop 30564 1726882926.48785: done getting the remaining hosts for this loop 30564 1726882926.48789: getting the next task for host managed_node2 30564 1726882926.48798: done getting next task for host managed_node2 30564 1726882926.48802: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882926.48807: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882926.48838: getting variables 30564 1726882926.48840: in VariableManager get_vars() 30564 1726882926.48890: Calling all_inventory to load vars for managed_node2 30564 1726882926.48893: Calling groups_inventory to load vars for managed_node2 30564 1726882926.48895: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.48904: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.48906: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.48909: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.50847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.52593: done with get_vars() 30564 1726882926.52616: done getting variables 30564 1726882926.52680: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:42:06 -0400 (0:00:00.106) 0:02:05.108 ****** 30564 1726882926.52715: entering _queue_task() for managed_node2/dnf 30564 1726882926.53010: worker is 1 (out of 1 available) 30564 1726882926.53022: exiting _queue_task() for managed_node2/dnf 30564 1726882926.53033: done queuing things up, now waiting for results queue to drain 30564 1726882926.53034: waiting for pending results... 30564 1726882926.53329: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 30564 1726882926.53508: in run() - task 0e448fcc-3ce9-4216-acec-000000002699 30564 1726882926.53528: variable 'ansible_search_path' from source: unknown 30564 1726882926.53536: variable 'ansible_search_path' from source: unknown 30564 1726882926.53580: calling self._execute() 30564 1726882926.53693: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.53707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.53723: variable 'omit' from source: magic vars 30564 1726882926.54122: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.54146: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.54366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882926.56897: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882926.56974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882926.57018: Loading FilterModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882926.57057: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882926.57096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882926.57186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.57235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.57273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.57325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.57346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.57487: variable 'ansible_distribution' from source: facts 30564 1726882926.57496: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.57519: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 30564 1726882926.57633: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882926.57762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.57794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.57821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.57870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.57891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.57936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.57969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.57999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.58040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.58062: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.58106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.58130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.58165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.58211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.58229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.58399: variable 'network_connections' from source: include params 30564 1726882926.58415: variable 'interface' from source: play vars 30564 1726882926.58483: variable 'interface' from source: play vars 30564 1726882926.58549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882926.58725: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882926.58762: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882926.58801: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882926.58835: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882926.58894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882926.58925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882926.58974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.59027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882926.59080: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882926.59362: variable 'network_connections' from source: include params 30564 1726882926.59379: variable 'interface' from source: play vars 30564 1726882926.59440: variable 'interface' from source: play vars 30564 1726882926.59475: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882926.59484: when evaluation is False, skipping this task 30564 1726882926.59490: _execute() done 30564 1726882926.59496: dumping result to json 30564 1726882926.59502: done dumping result, returning 30564 1726882926.59512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-000000002699] 30564 
1726882926.59522: sending task result for task 0e448fcc-3ce9-4216-acec-000000002699 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882926.59680: no more pending results, returning what we have 30564 1726882926.59685: results queue empty 30564 1726882926.59686: checking for any_errors_fatal 30564 1726882926.59694: done checking for any_errors_fatal 30564 1726882926.59695: checking for max_fail_percentage 30564 1726882926.59697: done checking for max_fail_percentage 30564 1726882926.59698: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.59699: done checking to see if all hosts have failed 30564 1726882926.59699: getting the remaining hosts for this loop 30564 1726882926.59701: done getting the remaining hosts for this loop 30564 1726882926.59705: getting the next task for host managed_node2 30564 1726882926.59715: done getting next task for host managed_node2 30564 1726882926.59719: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882926.59726: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882926.59760: getting variables 30564 1726882926.59762: in VariableManager get_vars() 30564 1726882926.59816: Calling all_inventory to load vars for managed_node2 30564 1726882926.59819: Calling groups_inventory to load vars for managed_node2 30564 1726882926.59822: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.59835: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.59838: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.59842: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.60784: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002699 30564 1726882926.60787: WORKER PROCESS EXITING 30564 1726882926.61271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.62245: done with get_vars() 30564 1726882926.62261: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 30564 1726882926.62318: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:42:06 -0400 (0:00:00.096) 0:02:05.204 ****** 30564 1726882926.62341: entering _queue_task() for managed_node2/yum 30564 1726882926.62575: worker is 1 (out of 1 available) 30564 1726882926.62587: exiting _queue_task() for managed_node2/yum 30564 1726882926.62627: done queuing things up, now waiting for results queue to drain 30564 1726882926.62629: waiting for pending results... 30564 1726882926.62833: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 30564 1726882926.63671: in run() - task 0e448fcc-3ce9-4216-acec-00000000269a 30564 1726882926.63677: variable 'ansible_search_path' from source: unknown 30564 1726882926.63680: variable 'ansible_search_path' from source: unknown 30564 1726882926.63683: calling self._execute() 30564 1726882926.63686: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.63692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.63694: variable 'omit' from source: magic vars 30564 1726882926.63696: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.63698: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.63701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882926.66068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882926.66114: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882926.66139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882926.66164: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882926.66187: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882926.66244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.66265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.66288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.66317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.66327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.66393: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.66404: Evaluated conditional (ansible_distribution_major_version | int < 8): False 30564 1726882926.66409: when evaluation is False, skipping this task 30564 1726882926.66412: _execute() done 30564 1726882926.66414: dumping result to json 30564 1726882926.66416: done dumping result, 
returning 30564 1726882926.66422: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000269a] 30564 1726882926.66428: sending task result for task 0e448fcc-3ce9-4216-acec-00000000269a 30564 1726882926.66516: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000269a 30564 1726882926.66518: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 30564 1726882926.66568: no more pending results, returning what we have 30564 1726882926.66572: results queue empty 30564 1726882926.66573: checking for any_errors_fatal 30564 1726882926.66582: done checking for any_errors_fatal 30564 1726882926.66583: checking for max_fail_percentage 30564 1726882926.66585: done checking for max_fail_percentage 30564 1726882926.66586: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.66586: done checking to see if all hosts have failed 30564 1726882926.66587: getting the remaining hosts for this loop 30564 1726882926.66589: done getting the remaining hosts for this loop 30564 1726882926.66592: getting the next task for host managed_node2 30564 1726882926.66600: done getting next task for host managed_node2 30564 1726882926.66604: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882926.66610: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882926.66638: getting variables 30564 1726882926.66639: in VariableManager get_vars() 30564 1726882926.66683: Calling all_inventory to load vars for managed_node2 30564 1726882926.66686: Calling groups_inventory to load vars for managed_node2 30564 1726882926.66688: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.66697: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.66699: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.66702: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.68155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.69808: done with get_vars() 30564 1726882926.69824: done getting variables 30564 1726882926.69869: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:42:06 -0400 (0:00:00.075) 0:02:05.280 ****** 30564 1726882926.69893: entering _queue_task() for managed_node2/fail 30564 1726882926.70107: worker is 1 (out of 1 available) 30564 1726882926.70118: exiting _queue_task() for managed_node2/fail 30564 1726882926.70130: done queuing things up, now waiting for results queue to drain 30564 1726882926.70131: waiting for pending results... 30564 1726882926.70313: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 30564 1726882926.70423: in run() - task 0e448fcc-3ce9-4216-acec-00000000269b 30564 1726882926.70434: variable 'ansible_search_path' from source: unknown 30564 1726882926.70437: variable 'ansible_search_path' from source: unknown 30564 1726882926.70466: calling self._execute() 30564 1726882926.70546: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.70551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.70559: variable 'omit' from source: magic vars 30564 1726882926.70842: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.70851: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.70940: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882926.71074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882926.73417: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882926.73488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882926.73526: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882926.73559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882926.73587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882926.73657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.73701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.73726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.73770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.73784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.73825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 
1726882926.73847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.73873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.73911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.73927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.73962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.73986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.74009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.74047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.74060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 30564 1726882926.74230: variable 'network_connections' from source: include params 30564 1726882926.74242: variable 'interface' from source: play vars 30564 1726882926.74306: variable 'interface' from source: play vars 30564 1726882926.74373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882926.74529: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882926.74566: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882926.74596: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882926.74623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882926.74666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882926.74690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882926.74713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.74738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882926.74786: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882926.75019: variable 'network_connections' from source: include params 30564 1726882926.75024: variable 'interface' from source: play 
vars 30564 1726882926.75085: variable 'interface' from source: play vars 30564 1726882926.75107: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882926.75111: when evaluation is False, skipping this task 30564 1726882926.75114: _execute() done 30564 1726882926.75118: dumping result to json 30564 1726882926.75120: done dumping result, returning 30564 1726882926.75127: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000269b] 30564 1726882926.75130: sending task result for task 0e448fcc-3ce9-4216-acec-00000000269b 30564 1726882926.75223: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000269b 30564 1726882926.75226: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882926.75281: no more pending results, returning what we have 30564 1726882926.75285: results queue empty 30564 1726882926.75286: checking for any_errors_fatal 30564 1726882926.75293: done checking for any_errors_fatal 30564 1726882926.75294: checking for max_fail_percentage 30564 1726882926.75295: done checking for max_fail_percentage 30564 1726882926.75296: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.75297: done checking to see if all hosts have failed 30564 1726882926.75298: getting the remaining hosts for this loop 30564 1726882926.75299: done getting the remaining hosts for this loop 30564 1726882926.75303: getting the next task for host managed_node2 30564 1726882926.75310: done getting next task for host managed_node2 30564 1726882926.75314: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 30564 1726882926.75319: ^ state is: 
HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882926.75346: getting variables 30564 1726882926.75347: in VariableManager get_vars() 30564 1726882926.75390: Calling all_inventory to load vars for managed_node2 30564 1726882926.75393: Calling groups_inventory to load vars for managed_node2 30564 1726882926.75395: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.75404: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.75407: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.75409: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.77041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.78874: done with get_vars() 30564 1726882926.78896: done getting variables 30564 1726882926.78960: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:42:06 -0400 (0:00:00.091) 0:02:05.371 ****** 30564 1726882926.79001: entering _queue_task() for managed_node2/package 30564 1726882926.79320: worker is 1 (out of 1 available) 30564 1726882926.79332: exiting _queue_task() for managed_node2/package 30564 1726882926.79344: done queuing things up, now waiting for results queue to drain 30564 1726882926.79345: waiting for pending results... 
30564 1726882926.79557: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 30564 1726882926.79661: in run() - task 0e448fcc-3ce9-4216-acec-00000000269c 30564 1726882926.79675: variable 'ansible_search_path' from source: unknown 30564 1726882926.79678: variable 'ansible_search_path' from source: unknown 30564 1726882926.79711: calling self._execute() 30564 1726882926.79787: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.79790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.79801: variable 'omit' from source: magic vars 30564 1726882926.80080: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.80090: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.80224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882926.80418: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882926.80449: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882926.80479: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882926.80519: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882926.80595: variable 'network_packages' from source: role '' defaults 30564 1726882926.80662: variable '__network_provider_setup' from source: role '' defaults 30564 1726882926.80673: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882926.80721: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882926.80729: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882926.80773: variable 
'__network_packages_default_nm' from source: role '' defaults 30564 1726882926.80891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882926.82850: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882926.82898: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882926.82924: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882926.82947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882926.82970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882926.83027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.83046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.83066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.83100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.83111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 
1726882926.83141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.83156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.83177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.83206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.83216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.83357: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882926.83429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.83445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.83462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.83492: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.83503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.83567: variable 'ansible_python' from source: facts 30564 1726882926.83582: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882926.83635: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882926.83695: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882926.83781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.83797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.83814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.83840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.83852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.83887: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882926.83907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882926.83924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.83948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882926.83960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882926.84054: variable 'network_connections' from source: include params 30564 1726882926.84057: variable 'interface' from source: play vars 30564 1726882926.84131: variable 'interface' from source: play vars 30564 1726882926.84195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882926.84214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882926.84234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882926.84255: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882926.84321: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882926.84793: variable 'network_connections' from source: include params 30564 1726882926.84796: variable 'interface' from source: play vars 30564 1726882926.84798: variable 'interface' from source: play vars 30564 1726882926.84801: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882926.84803: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882926.85070: variable 'network_connections' from source: include params 30564 1726882926.85079: variable 'interface' from source: play vars 30564 1726882926.85143: variable 'interface' from source: play vars 30564 1726882926.85165: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882926.85241: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882926.85548: variable 'network_connections' from source: include params 30564 1726882926.85551: variable 'interface' from source: play vars 30564 1726882926.85616: variable 'interface' from source: play vars 30564 1726882926.85671: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882926.85727: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882926.85734: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882926.85796: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882926.86015: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882926.86494: variable 'network_connections' from source: include params 30564 1726882926.86497: variable 'interface' from 
source: play vars 30564 1726882926.86543: variable 'interface' from source: play vars 30564 1726882926.86559: variable 'ansible_distribution' from source: facts 30564 1726882926.86562: variable '__network_rh_distros' from source: role '' defaults 30564 1726882926.86568: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.86587: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882926.86692: variable 'ansible_distribution' from source: facts 30564 1726882926.86695: variable '__network_rh_distros' from source: role '' defaults 30564 1726882926.86700: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.86710: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882926.86829: variable 'ansible_distribution' from source: facts 30564 1726882926.86832: variable '__network_rh_distros' from source: role '' defaults 30564 1726882926.86835: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.86866: variable 'network_provider' from source: set_fact 30564 1726882926.86877: variable 'ansible_facts' from source: unknown 30564 1726882926.87366: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 30564 1726882926.87369: when evaluation is False, skipping this task 30564 1726882926.87372: _execute() done 30564 1726882926.87375: dumping result to json 30564 1726882926.87379: done dumping result, returning 30564 1726882926.87386: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-4216-acec-00000000269c] 30564 1726882926.87392: sending task result for task 0e448fcc-3ce9-4216-acec-00000000269c 30564 1726882926.87482: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000269c 30564 1726882926.87485: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, 
"false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 30564 1726882926.87555: no more pending results, returning what we have 30564 1726882926.87559: results queue empty 30564 1726882926.87560: checking for any_errors_fatal 30564 1726882926.87568: done checking for any_errors_fatal 30564 1726882926.87569: checking for max_fail_percentage 30564 1726882926.87571: done checking for max_fail_percentage 30564 1726882926.87572: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.87573: done checking to see if all hosts have failed 30564 1726882926.87574: getting the remaining hosts for this loop 30564 1726882926.87575: done getting the remaining hosts for this loop 30564 1726882926.87579: getting the next task for host managed_node2 30564 1726882926.87586: done getting next task for host managed_node2 30564 1726882926.87590: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882926.87595: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882926.87620: getting variables 30564 1726882926.87622: in VariableManager get_vars() 30564 1726882926.87672: Calling all_inventory to load vars for managed_node2 30564 1726882926.87675: Calling groups_inventory to load vars for managed_node2 30564 1726882926.87677: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.87686: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.87688: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.87690: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.88615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.90173: done with get_vars() 30564 1726882926.90196: done getting variables 30564 1726882926.90253: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:42:06 -0400 (0:00:00.112) 0:02:05.484 ****** 30564 1726882926.90295: entering _queue_task() for managed_node2/package 30564 1726882926.90570: worker is 1 (out of 1 available) 30564 1726882926.90581: exiting _queue_task() for managed_node2/package 30564 1726882926.90593: done queuing things up, now waiting for results queue to drain 30564 
1726882926.90595: waiting for pending results... 30564 1726882926.90892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 30564 1726882926.91038: in run() - task 0e448fcc-3ce9-4216-acec-00000000269d 30564 1726882926.91058: variable 'ansible_search_path' from source: unknown 30564 1726882926.91061: variable 'ansible_search_path' from source: unknown 30564 1726882926.91095: calling self._execute() 30564 1726882926.91184: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.91189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.91201: variable 'omit' from source: magic vars 30564 1726882926.91499: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.91510: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.91596: variable 'network_state' from source: role '' defaults 30564 1726882926.91603: Evaluated conditional (network_state != {}): False 30564 1726882926.91606: when evaluation is False, skipping this task 30564 1726882926.91609: _execute() done 30564 1726882926.91611: dumping result to json 30564 1726882926.91615: done dumping result, returning 30564 1726882926.91625: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-4216-acec-00000000269d] 30564 1726882926.91630: sending task result for task 0e448fcc-3ce9-4216-acec-00000000269d 30564 1726882926.91723: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000269d 30564 1726882926.91728: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882926.91777: no more pending results, returning what we have 30564 1726882926.91780: 
results queue empty 30564 1726882926.91781: checking for any_errors_fatal 30564 1726882926.91786: done checking for any_errors_fatal 30564 1726882926.91786: checking for max_fail_percentage 30564 1726882926.91788: done checking for max_fail_percentage 30564 1726882926.91789: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.91789: done checking to see if all hosts have failed 30564 1726882926.91790: getting the remaining hosts for this loop 30564 1726882926.91792: done getting the remaining hosts for this loop 30564 1726882926.91795: getting the next task for host managed_node2 30564 1726882926.91803: done getting next task for host managed_node2 30564 1726882926.91806: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882926.91811: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882926.91835: getting variables 30564 1726882926.91838: in VariableManager get_vars() 30564 1726882926.91875: Calling all_inventory to load vars for managed_node2 30564 1726882926.91879: Calling groups_inventory to load vars for managed_node2 30564 1726882926.91881: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.91889: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.91892: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.91894: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.92784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.94301: done with get_vars() 30564 1726882926.94318: done getting variables 30564 1726882926.94366: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:42:06 -0400 (0:00:00.040) 0:02:05.525 ****** 30564 1726882926.94391: entering _queue_task() for managed_node2/package 30564 1726882926.94583: worker is 1 (out of 1 available) 30564 1726882926.94595: exiting _queue_task() for managed_node2/package 30564 1726882926.94606: done queuing things up, now waiting for results queue to drain 30564 1726882926.94608: waiting for pending results... 
30564 1726882926.94789: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 30564 1726882926.94881: in run() - task 0e448fcc-3ce9-4216-acec-00000000269e 30564 1726882926.94891: variable 'ansible_search_path' from source: unknown 30564 1726882926.94895: variable 'ansible_search_path' from source: unknown 30564 1726882926.94923: calling self._execute() 30564 1726882926.95002: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.95006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.95016: variable 'omit' from source: magic vars 30564 1726882926.95298: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.95307: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.95392: variable 'network_state' from source: role '' defaults 30564 1726882926.95402: Evaluated conditional (network_state != {}): False 30564 1726882926.95405: when evaluation is False, skipping this task 30564 1726882926.95408: _execute() done 30564 1726882926.95412: dumping result to json 30564 1726882926.95414: done dumping result, returning 30564 1726882926.95423: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-4216-acec-00000000269e] 30564 1726882926.95426: sending task result for task 0e448fcc-3ce9-4216-acec-00000000269e 30564 1726882926.95519: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000269e 30564 1726882926.95522: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882926.95573: no more pending results, returning what we have 30564 1726882926.95576: results queue empty 30564 1726882926.95577: checking for 
any_errors_fatal 30564 1726882926.95582: done checking for any_errors_fatal 30564 1726882926.95583: checking for max_fail_percentage 30564 1726882926.95584: done checking for max_fail_percentage 30564 1726882926.95585: checking to see if all hosts have failed and the running result is not ok 30564 1726882926.95586: done checking to see if all hosts have failed 30564 1726882926.95586: getting the remaining hosts for this loop 30564 1726882926.95588: done getting the remaining hosts for this loop 30564 1726882926.95591: getting the next task for host managed_node2 30564 1726882926.95599: done getting next task for host managed_node2 30564 1726882926.95602: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882926.95608: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882926.95631: getting variables 30564 1726882926.95639: in VariableManager get_vars() 30564 1726882926.95673: Calling all_inventory to load vars for managed_node2 30564 1726882926.95675: Calling groups_inventory to load vars for managed_node2 30564 1726882926.95677: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882926.95683: Calling all_plugins_play to load vars for managed_node2 30564 1726882926.95685: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882926.95687: Calling groups_plugins_play to load vars for managed_node2 30564 1726882926.96554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882926.97514: done with get_vars() 30564 1726882926.97529: done getting variables 30564 1726882926.97570: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:42:06 -0400 (0:00:00.032) 0:02:05.557 ****** 30564 1726882926.97595: entering _queue_task() for managed_node2/service 30564 1726882926.97777: worker is 1 (out of 1 available) 30564 1726882926.97789: exiting _queue_task() for managed_node2/service 30564 1726882926.97800: done queuing things up, now waiting for results queue to drain 30564 1726882926.97801: waiting for pending results... 
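[Editor's note] The skipped task above shows Ansible's conditional flow: the `when:` expression `network_state != {}` is rendered against the role's defaults, evaluates to False, and the executor emits a `skipping:` result with `false_condition` and `skip_reason` instead of running the module. A minimal Python sketch of that decision (hypothetical helper names; Ansible actually renders conditionals through Jinja2 templating, for which `eval()` is only a stand-in here):

```python
import json

def evaluate_when(condition: str, variables: dict) -> bool:
    # Stand-in for Jinja2 rendering of a `when:` expression.
    return bool(eval(condition, {}, variables))

def run_or_skip(task_name: str, condition: str, variables: dict) -> dict:
    if not evaluate_when(condition, variables):
        # Mirrors the log's "when evaluation is False, skipping this task"
        # and the JSON result shape shown for managed_node2.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

result = run_or_skip(
    "Install python3-libnmstate when using network_state variable",
    "network_state != {}",
    {"network_state": {}},  # role default: empty dict, so the task is skipped
)
print(json.dumps(result))
```

With `network_state` left at its empty-dict default the sketch reproduces the skip result seen in the log; a non-empty dict would let the task run.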
30564 1726882926.97972: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 30564 1726882926.98055: in run() - task 0e448fcc-3ce9-4216-acec-00000000269f 30564 1726882926.98065: variable 'ansible_search_path' from source: unknown 30564 1726882926.98071: variable 'ansible_search_path' from source: unknown 30564 1726882926.98097: calling self._execute() 30564 1726882926.98170: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882926.98175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882926.98182: variable 'omit' from source: magic vars 30564 1726882926.98440: variable 'ansible_distribution_major_version' from source: facts 30564 1726882926.98450: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882926.98537: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882926.98671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882927.00224: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882927.00271: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882927.00296: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882927.00324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882927.00343: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882927.00401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 30564 1726882927.00433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.00451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.00481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.00491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.00522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.00541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.00558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.00587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.00597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.00623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.00645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.00661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.00691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.00702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.00812: variable 'network_connections' from source: include params 30564 1726882927.00821: variable 'interface' from source: play vars 30564 1726882927.00870: variable 'interface' from source: play vars 30564 1726882927.00916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882927.01025: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882927.01051: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882927.01078: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882927.01099: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882927.01130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882927.01147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882927.01165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.01186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882927.01223: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882927.01377: variable 'network_connections' from source: include params 30564 1726882927.01380: variable 'interface' from source: play vars 30564 1726882927.01424: variable 'interface' from source: play vars 30564 1726882927.01441: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 30564 1726882927.01445: when evaluation is False, skipping this task 30564 1726882927.01447: _execute() done 30564 1726882927.01450: dumping result to json 30564 1726882927.01452: done dumping result, returning 30564 1726882927.01458: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-4216-acec-00000000269f] 30564 1726882927.01466: sending task result for task 0e448fcc-3ce9-4216-acec-00000000269f 30564 1726882927.01553: done sending task result for task 
0e448fcc-3ce9-4216-acec-00000000269f 30564 1726882927.01562: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 30564 1726882927.01612: no more pending results, returning what we have 30564 1726882927.01619: results queue empty 30564 1726882927.01620: checking for any_errors_fatal 30564 1726882927.01625: done checking for any_errors_fatal 30564 1726882927.01626: checking for max_fail_percentage 30564 1726882927.01627: done checking for max_fail_percentage 30564 1726882927.01628: checking to see if all hosts have failed and the running result is not ok 30564 1726882927.01629: done checking to see if all hosts have failed 30564 1726882927.01630: getting the remaining hosts for this loop 30564 1726882927.01631: done getting the remaining hosts for this loop 30564 1726882927.01634: getting the next task for host managed_node2 30564 1726882927.01642: done getting next task for host managed_node2 30564 1726882927.01645: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882927.01650: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882927.01677: getting variables 30564 1726882927.01679: in VariableManager get_vars() 30564 1726882927.01719: Calling all_inventory to load vars for managed_node2 30564 1726882927.01725: Calling groups_inventory to load vars for managed_node2 30564 1726882927.01727: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882927.01734: Calling all_plugins_play to load vars for managed_node2 30564 1726882927.01736: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882927.01737: Calling groups_plugins_play to load vars for managed_node2 30564 1726882927.02529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882927.03485: done with get_vars() 30564 1726882927.03500: done getting variables 30564 1726882927.03537: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:42:07 -0400 (0:00:00.059) 0:02:05.616 ****** 30564 1726882927.03561: entering _queue_task() for managed_node2/service 30564 1726882927.03749: worker is 1 (out of 1 available) 30564 1726882927.03760: exiting _queue_task() for managed_node2/service 30564 1726882927.03801: done 
queuing things up, now waiting for results queue to drain 30564 1726882927.03803: waiting for pending results... 30564 1726882927.03951: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 30564 1726882927.04059: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a0 30564 1726882927.04075: variable 'ansible_search_path' from source: unknown 30564 1726882927.04079: variable 'ansible_search_path' from source: unknown 30564 1726882927.04107: calling self._execute() 30564 1726882927.04183: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.04187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.04195: variable 'omit' from source: magic vars 30564 1726882927.04475: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.04485: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882927.04598: variable 'network_provider' from source: set_fact 30564 1726882927.04602: variable 'network_state' from source: role '' defaults 30564 1726882927.04612: Evaluated conditional (network_provider == "nm" or network_state != {}): True 30564 1726882927.04618: variable 'omit' from source: magic vars 30564 1726882927.04662: variable 'omit' from source: magic vars 30564 1726882927.04685: variable 'network_service_name' from source: role '' defaults 30564 1726882927.04730: variable 'network_service_name' from source: role '' defaults 30564 1726882927.04805: variable '__network_provider_setup' from source: role '' defaults 30564 1726882927.04810: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882927.04857: variable '__network_service_name_default_nm' from source: role '' defaults 30564 1726882927.04867: variable '__network_packages_default_nm' from source: role '' defaults 30564 1726882927.04912: variable '__network_packages_default_nm' from source: role '' 
defaults 30564 1726882927.05055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882927.06799: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882927.06842: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882927.06873: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882927.06899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882927.06918: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882927.06980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.07012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.07029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.07058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.07073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.07105: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.07121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.07137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.07164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.07180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.07316: variable '__network_packages_default_gobject_packages' from source: role '' defaults 30564 1726882927.07392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.07409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.07425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.07449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.07460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.07526: variable 'ansible_python' from source: facts 30564 1726882927.07538: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 30564 1726882927.07598: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882927.07649: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882927.07734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.07751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.07768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.07797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.07808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.07840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.07859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.07880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.07906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.07916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.08007: variable 'network_connections' from source: include params 30564 1726882927.08015: variable 'interface' from source: play vars 30564 1726882927.08066: variable 'interface' from source: play vars 30564 1726882927.08171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882927.08283: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882927.08318: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882927.08346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882927.08381: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882927.08423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882927.08443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882927.08470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.08494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882927.08530: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882927.08715: variable 'network_connections' from source: include params 30564 1726882927.08721: variable 'interface' from source: play vars 30564 1726882927.08774: variable 'interface' from source: play vars 30564 1726882927.08800: variable '__network_packages_default_wireless' from source: role '' defaults 30564 1726882927.08851: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882927.09038: variable 'network_connections' from source: include params 30564 1726882927.09041: variable 'interface' from source: play vars 30564 1726882927.09095: variable 'interface' from source: play vars 30564 1726882927.09112: variable '__network_packages_default_team' from source: role '' defaults 30564 1726882927.09168: variable '__network_team_connections_defined' from source: role '' defaults 30564 1726882927.09479: variable 'network_connections' from source: include params 30564 1726882927.09490: variable 'interface' from source: play vars 30564 1726882927.09570: variable 'interface' from source: play vars 30564 1726882927.09631: variable '__network_service_name_default_initscripts' 
from source: role '' defaults 30564 1726882927.09709: variable '__network_service_name_default_initscripts' from source: role '' defaults 30564 1726882927.09723: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882927.09793: variable '__network_packages_default_initscripts' from source: role '' defaults 30564 1726882927.10031: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 30564 1726882927.10610: variable 'network_connections' from source: include params 30564 1726882927.10614: variable 'interface' from source: play vars 30564 1726882927.10657: variable 'interface' from source: play vars 30564 1726882927.10663: variable 'ansible_distribution' from source: facts 30564 1726882927.10668: variable '__network_rh_distros' from source: role '' defaults 30564 1726882927.10680: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.10690: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 30564 1726882927.10806: variable 'ansible_distribution' from source: facts 30564 1726882927.10809: variable '__network_rh_distros' from source: role '' defaults 30564 1726882927.10811: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.10822: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 30564 1726882927.10935: variable 'ansible_distribution' from source: facts 30564 1726882927.10938: variable '__network_rh_distros' from source: role '' defaults 30564 1726882927.10943: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.10970: variable 'network_provider' from source: set_fact 30564 1726882927.10988: variable 'omit' from source: magic vars 30564 1726882927.11008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882927.11027: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882927.11041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882927.11054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882927.11062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882927.11091: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882927.11094: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.11096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.11160: Set connection var ansible_timeout to 10 30564 1726882927.11165: Set connection var ansible_pipelining to False 30564 1726882927.11169: Set connection var ansible_shell_type to sh 30564 1726882927.11176: Set connection var ansible_shell_executable to /bin/sh 30564 1726882927.11184: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882927.11186: Set connection var ansible_connection to ssh 30564 1726882927.11207: variable 'ansible_shell_executable' from source: unknown 30564 1726882927.11210: variable 'ansible_connection' from source: unknown 30564 1726882927.11212: variable 'ansible_module_compression' from source: unknown 30564 1726882927.11215: variable 'ansible_shell_type' from source: unknown 30564 1726882927.11217: variable 'ansible_shell_executable' from source: unknown 30564 1726882927.11219: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.11221: variable 'ansible_pipelining' from source: unknown 30564 1726882927.11223: variable 'ansible_timeout' from source: unknown 30564 1726882927.11228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882927.11302: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882927.11309: variable 'omit' from source: magic vars 30564 1726882927.11312: starting attempt loop 30564 1726882927.11314: running the handler 30564 1726882927.11370: variable 'ansible_facts' from source: unknown 30564 1726882927.11801: _low_level_execute_command(): starting 30564 1726882927.11807: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882927.12286: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.12405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.12408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.12507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.12511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882927.12513: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 30564 1726882927.12515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.12877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882927.14282: stdout chunk (state=3): >>>/root <<< 30564 1726882927.14406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882927.14458: stderr chunk (state=3): >>><<< 30564 1726882927.14469: stdout chunk (state=3): >>><<< 30564 1726882927.14491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882927.14515: _low_level_execute_command(): starting 30564 1726882927.14526: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595 `" && echo ansible-tmp-1726882927.1449652-36054-182047599586595="` echo /root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595 `" ) && sleep 0' 30564 1726882927.15128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882927.15140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882927.15154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.15177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.15214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882927.15227: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882927.15243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.15261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882927.15281: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882927.15293: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882927.15305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882927.15318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.15333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.15345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882927.15356: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882927.15372: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.15451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882927.15477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882927.15501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.15631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882927.17518: stdout chunk (state=3): >>>ansible-tmp-1726882927.1449652-36054-182047599586595=/root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595 <<< 30564 1726882927.17706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882927.17709: stdout chunk (state=3): >>><<< 30564 1726882927.17711: stderr chunk (state=3): >>><<< 30564 1726882927.17886: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882927.1449652-36054-182047599586595=/root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882927.17889: variable 'ansible_module_compression' from source: unknown 30564 1726882927.17892: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 30564 1726882927.17894: variable 'ansible_facts' from source: unknown 30564 1726882927.18169: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595/AnsiballZ_systemd.py 30564 1726882927.18293: Sending initial data 30564 1726882927.18305: Sent initial data (156 bytes) 30564 1726882927.18965: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.18969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.19008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.19012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.19014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.19057: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882927.19063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.19179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882927.20974: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882927.21069: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882927.21169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp5ws9vegx /root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595/AnsiballZ_systemd.py <<< 30564 1726882927.21260: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882927.23427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882927.23504: stderr chunk (state=3): >>><<< 30564 1726882927.23507: stdout chunk (state=3): >>><<< 30564 1726882927.23525: done transferring module to remote 30564 1726882927.23536: _low_level_execute_command(): starting 30564 1726882927.23539: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595/ /root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595/AnsiballZ_systemd.py 
&& sleep 0' 30564 1726882927.24259: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.24281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.24324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882927.24327: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.24329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.24331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.24375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882927.24393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.24500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882927.26267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882927.26313: stderr chunk (state=3): >>><<< 30564 1726882927.26328: stdout chunk (state=3): >>><<< 30564 1726882927.26346: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882927.26350: _low_level_execute_command(): starting 30564 1726882927.26352: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595/AnsiballZ_systemd.py && sleep 0' 30564 1726882927.27053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.27189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882927.52224: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; 
code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9183232", "MemoryAvailable": "infinity", "CPUUsageNSec": "2423169000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": 
"infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 30564 1726882927.52234: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", 
"ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 
2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 30564 1726882927.53685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882927.53743: stderr chunk (state=3): >>><<< 30564 1726882927.53747: stdout chunk (state=3): >>><<< 30564 1726882927.53764: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9183232", "MemoryAvailable": "infinity", "CPUUsageNSec": "2423169000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882927.53878: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882927.53892: _low_level_execute_command(): starting 30564 1726882927.53896: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882927.1449652-36054-182047599586595/ > /dev/null 2>&1 && sleep 0' 30564 1726882927.54349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882927.54353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.54392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.54404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.54455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882927.54479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.54580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882927.56353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882927.56399: stderr chunk (state=3): >>><<< 30564 1726882927.56405: stdout chunk (state=3): >>><<< 30564 1726882927.56417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882927.56423: handler run complete 30564 1726882927.56461: attempt loop complete, returning result 30564 1726882927.56466: _execute() done 30564 1726882927.56468: dumping result to json 30564 1726882927.56481: done dumping result, returning 30564 1726882927.56489: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-4216-acec-0000000026a0] 30564 1726882927.56494: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a0 30564 1726882927.56684: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a0 30564 1726882927.56686: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882927.56753: no more pending results, returning what we have 30564 1726882927.56756: results queue empty 30564 1726882927.56757: checking for any_errors_fatal 30564 1726882927.56762: done checking for any_errors_fatal 30564 1726882927.56765: checking for max_fail_percentage 30564 1726882927.56767: done checking for max_fail_percentage 30564 1726882927.56768: checking to see if all hosts have failed and the running result is not ok 30564 1726882927.56769: done checking to see if all hosts have failed 30564 1726882927.56770: getting the remaining hosts for this loop 30564 1726882927.56772: done getting the remaining hosts for this loop 30564 1726882927.56776: getting the next task for host managed_node2 30564 1726882927.56784: done getting next task for host managed_node2 30564 1726882927.56788: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 
1726882927.56794: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882927.56814: getting variables 30564 1726882927.56816: in VariableManager get_vars() 30564 1726882927.56858: Calling all_inventory to load vars for managed_node2 30564 1726882927.56861: Calling groups_inventory to load vars for managed_node2 30564 1726882927.56863: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882927.56875: Calling all_plugins_play to load vars for managed_node2 30564 1726882927.56877: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882927.56880: Calling groups_plugins_play to load vars for managed_node2 30564 1726882927.57837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882927.58784: done with get_vars() 30564 1726882927.58800: done getting variables 30564 1726882927.58842: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:42:07 -0400 (0:00:00.553) 0:02:06.170 ****** 30564 1726882927.58874: entering _queue_task() for managed_node2/service 30564 1726882927.59087: worker is 1 (out of 1 available) 30564 1726882927.59101: exiting _queue_task() for managed_node2/service 30564 1726882927.59113: done queuing things up, now waiting for results queue to drain 30564 1726882927.59114: waiting for pending results... 
30564 1726882927.59303: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 30564 1726882927.59403: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a1 30564 1726882927.59420: variable 'ansible_search_path' from source: unknown 30564 1726882927.59424: variable 'ansible_search_path' from source: unknown 30564 1726882927.59451: calling self._execute() 30564 1726882927.59537: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.59541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.59552: variable 'omit' from source: magic vars 30564 1726882927.59872: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.59883: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882927.59962: variable 'network_provider' from source: set_fact 30564 1726882927.59973: Evaluated conditional (network_provider == "nm"): True 30564 1726882927.60035: variable '__network_wpa_supplicant_required' from source: role '' defaults 30564 1726882927.60102: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 30564 1726882927.60221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882927.62225: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882927.62293: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882927.62324: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882927.62364: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882927.62392: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882927.62458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.62483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.62502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.62535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.62546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.62581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.62600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.62617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.62642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.62652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.62683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.62700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.62719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.62743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.62753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.62854: variable 'network_connections' from source: include params 30564 1726882927.62866: variable 'interface' from source: play vars 30564 1726882927.62912: variable 'interface' from source: play vars 30564 1726882927.62961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 30564 1726882927.63072: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 30564 1726882927.63099: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 30564 1726882927.63121: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 30564 1726882927.63146: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 30564 1726882927.63177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 30564 1726882927.63193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 30564 1726882927.63211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.63228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 30564 1726882927.63269: variable '__network_wireless_connections_defined' from source: role '' defaults 30564 1726882927.63420: variable 'network_connections' from source: include params 30564 1726882927.63424: variable 'interface' from source: play vars 30564 1726882927.63466: variable 'interface' from source: play vars 30564 1726882927.63491: Evaluated conditional (__network_wpa_supplicant_required): False 30564 1726882927.63495: when evaluation is False, skipping this task 30564 1726882927.63497: _execute() done 30564 1726882927.63500: dumping result to json 30564 1726882927.63502: done dumping result, returning 30564 1726882927.63509: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-4216-acec-0000000026a1] 30564 
1726882927.63519: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a1 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 30564 1726882927.63645: no more pending results, returning what we have 30564 1726882927.63649: results queue empty 30564 1726882927.63650: checking for any_errors_fatal 30564 1726882927.63670: done checking for any_errors_fatal 30564 1726882927.63671: checking for max_fail_percentage 30564 1726882927.63674: done checking for max_fail_percentage 30564 1726882927.63676: checking to see if all hosts have failed and the running result is not ok 30564 1726882927.63677: done checking to see if all hosts have failed 30564 1726882927.63678: getting the remaining hosts for this loop 30564 1726882927.63680: done getting the remaining hosts for this loop 30564 1726882927.63683: getting the next task for host managed_node2 30564 1726882927.63693: done getting next task for host managed_node2 30564 1726882927.63696: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882927.63701: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882927.63725: getting variables 30564 1726882927.63726: in VariableManager get_vars() 30564 1726882927.63773: Calling all_inventory to load vars for managed_node2 30564 1726882927.63776: Calling groups_inventory to load vars for managed_node2 30564 1726882927.63779: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882927.63788: Calling all_plugins_play to load vars for managed_node2 30564 1726882927.63790: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882927.63792: Calling groups_plugins_play to load vars for managed_node2 30564 1726882927.64328: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a1 30564 1726882927.64331: WORKER PROCESS EXITING 30564 1726882927.65095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882927.72257: done with get_vars() 30564 1726882927.72284: done getting variables 30564 1726882927.72329: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:42:07 -0400 (0:00:00.134) 0:02:06.304 ****** 30564 1726882927.72362: entering _queue_task() for managed_node2/service 30564 1726882927.72707: worker is 1 (out of 1 available) 30564 
1726882927.72720: exiting _queue_task() for managed_node2/service 30564 1726882927.72733: done queuing things up, now waiting for results queue to drain 30564 1726882927.72736: waiting for pending results... 30564 1726882927.73031: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 30564 1726882927.73195: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a2 30564 1726882927.73218: variable 'ansible_search_path' from source: unknown 30564 1726882927.73227: variable 'ansible_search_path' from source: unknown 30564 1726882927.73267: calling self._execute() 30564 1726882927.73377: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.73389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.73407: variable 'omit' from source: magic vars 30564 1726882927.73821: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.73845: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882927.73990: variable 'network_provider' from source: set_fact 30564 1726882927.74001: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882927.74009: when evaluation is False, skipping this task 30564 1726882927.74016: _execute() done 30564 1726882927.74023: dumping result to json 30564 1726882927.74030: done dumping result, returning 30564 1726882927.74040: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-4216-acec-0000000026a2] 30564 1726882927.74053: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a2 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 30564 1726882927.74207: no more pending results, returning what we have 30564 1726882927.74212: results queue empty 30564 1726882927.74213: 
checking for any_errors_fatal 30564 1726882927.74224: done checking for any_errors_fatal 30564 1726882927.74224: checking for max_fail_percentage 30564 1726882927.74227: done checking for max_fail_percentage 30564 1726882927.74228: checking to see if all hosts have failed and the running result is not ok 30564 1726882927.74228: done checking to see if all hosts have failed 30564 1726882927.74229: getting the remaining hosts for this loop 30564 1726882927.74231: done getting the remaining hosts for this loop 30564 1726882927.74235: getting the next task for host managed_node2 30564 1726882927.74245: done getting next task for host managed_node2 30564 1726882927.74249: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882927.74256: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882927.74292: getting variables 30564 1726882927.74294: in VariableManager get_vars() 30564 1726882927.74341: Calling all_inventory to load vars for managed_node2 30564 1726882927.74344: Calling groups_inventory to load vars for managed_node2 30564 1726882927.74346: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882927.74358: Calling all_plugins_play to load vars for managed_node2 30564 1726882927.74361: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882927.74366: Calling groups_plugins_play to load vars for managed_node2 30564 1726882927.75379: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a2 30564 1726882927.75382: WORKER PROCESS EXITING 30564 1726882927.76179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882927.77994: done with get_vars() 30564 1726882927.78024: done getting variables 30564 1726882927.78084: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:42:07 -0400 (0:00:00.057) 0:02:06.362 ****** 30564 1726882927.78120: entering _queue_task() for managed_node2/copy 30564 1726882927.78418: worker is 1 (out of 1 available) 30564 1726882927.78432: exiting _queue_task() for managed_node2/copy 30564 1726882927.78445: done queuing things up, now waiting for results queue to drain 30564 1726882927.78446: waiting for pending results... 
30564 1726882927.78750: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 30564 1726882927.78915: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a3 30564 1726882927.78933: variable 'ansible_search_path' from source: unknown 30564 1726882927.78941: variable 'ansible_search_path' from source: unknown 30564 1726882927.78981: calling self._execute() 30564 1726882927.79091: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.79109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.79124: variable 'omit' from source: magic vars 30564 1726882927.79508: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.79528: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882927.79672: variable 'network_provider' from source: set_fact 30564 1726882927.79686: Evaluated conditional (network_provider == "initscripts"): False 30564 1726882927.79695: when evaluation is False, skipping this task 30564 1726882927.79704: _execute() done 30564 1726882927.79712: dumping result to json 30564 1726882927.79720: done dumping result, returning 30564 1726882927.79732: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-4216-acec-0000000026a3] 30564 1726882927.79743: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a3 30564 1726882927.79854: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a3 30564 1726882927.79863: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 30564 1726882927.79909: no more pending results, returning what we have 30564 1726882927.79913: results queue empty 30564 1726882927.79914: checking for 
any_errors_fatal 30564 1726882927.79921: done checking for any_errors_fatal 30564 1726882927.79922: checking for max_fail_percentage 30564 1726882927.79924: done checking for max_fail_percentage 30564 1726882927.79925: checking to see if all hosts have failed and the running result is not ok 30564 1726882927.79926: done checking to see if all hosts have failed 30564 1726882927.79927: getting the remaining hosts for this loop 30564 1726882927.79928: done getting the remaining hosts for this loop 30564 1726882927.79932: getting the next task for host managed_node2 30564 1726882927.79941: done getting next task for host managed_node2 30564 1726882927.79945: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882927.79951: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882927.79989: getting variables 30564 1726882927.79992: in VariableManager get_vars() 30564 1726882927.80041: Calling all_inventory to load vars for managed_node2 30564 1726882927.80043: Calling groups_inventory to load vars for managed_node2 30564 1726882927.80046: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882927.80058: Calling all_plugins_play to load vars for managed_node2 30564 1726882927.80060: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882927.80065: Calling groups_plugins_play to load vars for managed_node2 30564 1726882927.82081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882927.83804: done with get_vars() 30564 1726882927.83828: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:42:07 -0400 (0:00:00.057) 0:02:06.420 ****** 30564 1726882927.83923: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882927.84257: worker is 1 (out of 1 available) 30564 1726882927.84272: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 30564 1726882927.84284: done queuing things up, now waiting for results queue to drain 30564 1726882927.84285: waiting for pending results... 
30564 1726882927.84595: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 30564 1726882927.84765: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a4 30564 1726882927.84785: variable 'ansible_search_path' from source: unknown 30564 1726882927.84792: variable 'ansible_search_path' from source: unknown 30564 1726882927.84833: calling self._execute() 30564 1726882927.84951: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.84973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.84991: variable 'omit' from source: magic vars 30564 1726882927.85406: variable 'ansible_distribution_major_version' from source: facts 30564 1726882927.85424: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882927.85435: variable 'omit' from source: magic vars 30564 1726882927.85514: variable 'omit' from source: magic vars 30564 1726882927.85672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882927.88041: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882927.88125: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882927.88160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882927.88197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882927.88233: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882927.88319: variable 'network_provider' from source: set_fact 30564 1726882927.88448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882927.88482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882927.88509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882927.88555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 30564 1726882927.88574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882927.88643: variable 'omit' from source: magic vars 30564 1726882927.88763: variable 'omit' from source: magic vars 30564 1726882927.88875: variable 'network_connections' from source: include params 30564 1726882927.88896: variable 'interface' from source: play vars 30564 1726882927.88959: variable 'interface' from source: play vars 30564 1726882927.89129: variable 'omit' from source: magic vars 30564 1726882927.89142: variable '__lsr_ansible_managed' from source: task vars 30564 1726882927.89217: variable '__lsr_ansible_managed' from source: task vars 30564 1726882927.89404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 30564 1726882927.89659: Loaded config def from plugin (lookup/template) 30564 1726882927.89672: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 30564 1726882927.89706: File lookup term: get_ansible_managed.j2 30564 1726882927.89714: variable 
'ansible_search_path' from source: unknown 30564 1726882927.89724: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 30564 1726882927.89747: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 30564 1726882927.89775: variable 'ansible_search_path' from source: unknown 30564 1726882927.94692: variable 'ansible_managed' from source: unknown 30564 1726882927.94822: variable 'omit' from source: magic vars 30564 1726882927.94851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882927.94878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882927.94898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882927.94916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 30564 1726882927.94928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882927.94956: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882927.94962: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.94972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.95052: Set connection var ansible_timeout to 10 30564 1726882927.95061: Set connection var ansible_pipelining to False 30564 1726882927.95069: Set connection var ansible_shell_type to sh 30564 1726882927.95078: Set connection var ansible_shell_executable to /bin/sh 30564 1726882927.95087: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882927.95092: Set connection var ansible_connection to ssh 30564 1726882927.95116: variable 'ansible_shell_executable' from source: unknown 30564 1726882927.95121: variable 'ansible_connection' from source: unknown 30564 1726882927.95126: variable 'ansible_module_compression' from source: unknown 30564 1726882927.95131: variable 'ansible_shell_type' from source: unknown 30564 1726882927.95135: variable 'ansible_shell_executable' from source: unknown 30564 1726882927.95140: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882927.95146: variable 'ansible_pipelining' from source: unknown 30564 1726882927.95150: variable 'ansible_timeout' from source: unknown 30564 1726882927.95157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882927.95268: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882927.95289: variable 'omit' from 
source: magic vars 30564 1726882927.95298: starting attempt loop 30564 1726882927.95303: running the handler 30564 1726882927.95316: _low_level_execute_command(): starting 30564 1726882927.95325: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882927.95819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.95840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882927.95853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.95906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882927.95916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.96036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882927.97686: stdout chunk (state=3): >>>/root <<< 30564 1726882927.97789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882927.97832: stderr chunk (state=3): >>><<< 30564 1726882927.97835: stdout chunk (state=3): >>><<< 30564 1726882927.97855: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882927.97865: _low_level_execute_command(): starting 30564 1726882927.97869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417 `" && echo ansible-tmp-1726882927.978532-36081-118156847126417="` echo /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417 `" ) && sleep 0' 30564 1726882927.98268: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882927.98280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882927.98309: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882927.98321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882927.98373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882927.98382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882927.98495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.00350: stdout chunk (state=3): >>>ansible-tmp-1726882927.978532-36081-118156847126417=/root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417 <<< 30564 1726882928.00461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882928.00504: stderr chunk (state=3): >>><<< 30564 1726882928.00508: stdout chunk (state=3): >>><<< 30564 1726882928.00519: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882927.978532-36081-118156847126417=/root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882928.00551: variable 'ansible_module_compression' from source: unknown 30564 1726882928.00589: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 30564 1726882928.00610: variable 'ansible_facts' from source: unknown 30564 1726882928.00675: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417/AnsiballZ_network_connections.py 30564 1726882928.00772: Sending initial data 30564 1726882928.00777: Sent initial data (167 bytes) 30564 1726882928.01477: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.01490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.01502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.01515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.01551: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.01566: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.01579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.01595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.01604: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.01612: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.01621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.01631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.01642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.01650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.01658: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.01671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.01739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.01754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.01772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.01895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.03646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882928.03742: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882928.03834: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp7587en9p /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417/AnsiballZ_network_connections.py <<< 30564 1726882928.03926: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882928.05350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882928.05425: stderr chunk (state=3): >>><<< 30564 1726882928.05433: stdout chunk (state=3): >>><<< 30564 1726882928.05455: done transferring module to remote 30564 1726882928.05470: _low_level_execute_command(): starting 30564 1726882928.05479: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417/ /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417/AnsiballZ_network_connections.py && sleep 0' 30564 1726882928.06095: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.06109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.06124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.06142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.06190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 30564 1726882928.06203: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.06217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.06234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.06246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.06257: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.06276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.06291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.06307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.06319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.06330: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.06343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.06421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.06442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.06458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.06593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.08318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882928.08389: stderr chunk (state=3): >>><<< 30564 1726882928.08392: stdout chunk (state=3): >>><<< 30564 1726882928.08474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882928.08478: _low_level_execute_command(): starting 30564 1726882928.08480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417/AnsiballZ_network_connections.py && sleep 0' 30564 1726882928.09042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.09056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.09076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.09094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.09136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.09149: stderr chunk (state=3): >>>debug2: match not found <<< 30564 
1726882928.09161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.09191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.09203: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.09213: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.09224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.09236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.09253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.09266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.09281: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.09294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.09377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.09398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.09414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.09546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.33157: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 30564 1726882928.34673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882928.34677: stdout chunk (state=3): >>><<< 30564 1726882928.34680: stderr chunk (state=3): >>><<< 30564 1726882928.34776: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "statebr", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882928.34780: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'statebr', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882928.34783: _low_level_execute_command(): starting 30564 1726882928.34785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882927.978532-36081-118156847126417/ > /dev/null 2>&1 && sleep 0' 30564 1726882928.35509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.35526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.35548: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.35578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.35624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.35636: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.35655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.35681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.35694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.35705: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.35717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.35734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.35753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.35772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.35785: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.35799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.35888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.35909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.35927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.36055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.37874: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 30564 1726882928.37950: stderr chunk (state=3): >>><<< 30564 1726882928.37961: stdout chunk (state=3): >>><<< 30564 1726882928.38280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882928.38288: handler run complete 30564 1726882928.38291: attempt loop complete, returning result 30564 1726882928.38293: _execute() done 30564 1726882928.38295: dumping result to json 30564 1726882928.38297: done dumping result, returning 30564 1726882928.38300: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-4216-acec-0000000026a4] 30564 1726882928.38302: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a4 30564 1726882928.38389: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a4 30564 
1726882928.38393: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: [002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete 30564 1726882928.38510: no more pending results, returning what we have 30564 1726882928.38513: results queue empty 30564 1726882928.38515: checking for any_errors_fatal 30564 1726882928.38521: done checking for any_errors_fatal 30564 1726882928.38522: checking for max_fail_percentage 30564 1726882928.38524: done checking for max_fail_percentage 30564 1726882928.38525: checking to see if all hosts have failed and the running result is not ok 30564 1726882928.38526: done checking to see if all hosts have failed 30564 1726882928.38526: getting the remaining hosts for this loop 30564 1726882928.38529: done getting the remaining hosts for this loop 30564 1726882928.38532: getting the next task for host managed_node2 30564 1726882928.38541: done getting next task for host managed_node2 30564 1726882928.38545: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882928.38551: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882928.38570: getting variables 30564 1726882928.38572: in VariableManager get_vars() 30564 1726882928.38621: Calling all_inventory to load vars for managed_node2 30564 1726882928.38624: Calling groups_inventory to load vars for managed_node2 30564 1726882928.38626: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882928.38637: Calling all_plugins_play to load vars for managed_node2 30564 1726882928.38640: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882928.38643: Calling groups_plugins_play to load vars for managed_node2 30564 1726882928.40424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882928.42475: done with get_vars() 30564 1726882928.42497: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:42:08 -0400 (0:00:00.586) 0:02:07.007 ****** 30564 1726882928.42591: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882928.42930: worker is 1 (out of 1 available) 30564 1726882928.42943: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 30564 1726882928.42954: done queuing things up, now waiting 
for results queue to drain 30564 1726882928.42956: waiting for pending results... 30564 1726882928.43356: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 30564 1726882928.43590: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a5 30564 1726882928.43615: variable 'ansible_search_path' from source: unknown 30564 1726882928.43631: variable 'ansible_search_path' from source: unknown 30564 1726882928.43686: calling self._execute() 30564 1726882928.43859: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.43883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.43907: variable 'omit' from source: magic vars 30564 1726882928.44500: variable 'ansible_distribution_major_version' from source: facts 30564 1726882928.44527: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882928.44756: variable 'network_state' from source: role '' defaults 30564 1726882928.44780: Evaluated conditional (network_state != {}): False 30564 1726882928.44795: when evaluation is False, skipping this task 30564 1726882928.44804: _execute() done 30564 1726882928.44811: dumping result to json 30564 1726882928.44819: done dumping result, returning 30564 1726882928.44829: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-4216-acec-0000000026a5] 30564 1726882928.44855: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a5 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 30564 1726882928.45094: no more pending results, returning what we have 30564 1726882928.45098: results queue empty 30564 1726882928.45099: checking for any_errors_fatal 30564 1726882928.45111: done checking for any_errors_fatal 30564 1726882928.45112: checking for max_fail_percentage 30564 
1726882928.45114: done checking for max_fail_percentage 30564 1726882928.45116: checking to see if all hosts have failed and the running result is not ok 30564 1726882928.45116: done checking to see if all hosts have failed 30564 1726882928.45117: getting the remaining hosts for this loop 30564 1726882928.45119: done getting the remaining hosts for this loop 30564 1726882928.45123: getting the next task for host managed_node2 30564 1726882928.45133: done getting next task for host managed_node2 30564 1726882928.45137: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882928.45145: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882928.45181: getting variables 30564 1726882928.45184: in VariableManager get_vars() 30564 1726882928.45264: Calling all_inventory to load vars for managed_node2 30564 1726882928.45272: Calling groups_inventory to load vars for managed_node2 30564 1726882928.45278: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882928.45290: Calling all_plugins_play to load vars for managed_node2 30564 1726882928.45296: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882928.45300: Calling groups_plugins_play to load vars for managed_node2 30564 1726882928.46332: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a5 30564 1726882928.46336: WORKER PROCESS EXITING 30564 1726882928.47483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882928.49459: done with get_vars() 30564 1726882928.49483: done getting variables 30564 1726882928.49547: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:42:08 -0400 (0:00:00.069) 0:02:07.077 ****** 30564 1726882928.49587: entering _queue_task() for managed_node2/debug 30564 1726882928.49872: worker is 1 (out of 1 available) 30564 1726882928.49884: exiting _queue_task() for managed_node2/debug 30564 1726882928.49896: done queuing things up, now waiting for results queue to drain 30564 1726882928.49897: waiting for pending results... 
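
The `module_args` logged above for the "Configure networking connection profiles" task correspond to a role invocation along these lines. This is a sketch only: the `connections` values are taken from the logged `module_args`, but the play and host scaffolding around them is assumed, not shown in the log.

```yaml
# Sketch: a play shape that would produce the logged module_args.
# Only name/persistent_state/state come from the log; everything
# else (hosts, role wiring) is an assumption.
- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: statebr            # "name": "statebr" in the log
            persistent_state: absent # "persistent_state": "absent"
            state: down              # "state": "down"
```

The `"provider": "nm"` field in the logged invocation is the role's NetworkManager backend, which the role selects when `network_provider` is not set explicitly.
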
30564 1726882928.50202: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 30564 1726882928.50343: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a6 30564 1726882928.50362: variable 'ansible_search_path' from source: unknown 30564 1726882928.50373: variable 'ansible_search_path' from source: unknown 30564 1726882928.50418: calling self._execute() 30564 1726882928.50533: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.50545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.50567: variable 'omit' from source: magic vars 30564 1726882928.50977: variable 'ansible_distribution_major_version' from source: facts 30564 1726882928.50999: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882928.51012: variable 'omit' from source: magic vars 30564 1726882928.51104: variable 'omit' from source: magic vars 30564 1726882928.51139: variable 'omit' from source: magic vars 30564 1726882928.51189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882928.51232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882928.51256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882928.51285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882928.51304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882928.51343: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882928.51351: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.51358: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 30564 1726882928.51473: Set connection var ansible_timeout to 10 30564 1726882928.51490: Set connection var ansible_pipelining to False 30564 1726882928.51498: Set connection var ansible_shell_type to sh 30564 1726882928.51509: Set connection var ansible_shell_executable to /bin/sh 30564 1726882928.51522: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882928.51534: Set connection var ansible_connection to ssh 30564 1726882928.51562: variable 'ansible_shell_executable' from source: unknown 30564 1726882928.51574: variable 'ansible_connection' from source: unknown 30564 1726882928.51583: variable 'ansible_module_compression' from source: unknown 30564 1726882928.51595: variable 'ansible_shell_type' from source: unknown 30564 1726882928.51602: variable 'ansible_shell_executable' from source: unknown 30564 1726882928.51609: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.51616: variable 'ansible_pipelining' from source: unknown 30564 1726882928.51622: variable 'ansible_timeout' from source: unknown 30564 1726882928.51629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.51780: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882928.51795: variable 'omit' from source: magic vars 30564 1726882928.51807: starting attempt loop 30564 1726882928.51813: running the handler 30564 1726882928.51945: variable '__network_connections_result' from source: set_fact 30564 1726882928.52005: handler run complete 30564 1726882928.52032: attempt loop complete, returning result 30564 1726882928.52039: _execute() done 30564 1726882928.52045: dumping result to json 30564 1726882928.52051: 
done dumping result, returning 30564 1726882928.52061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000026a6] 30564 1726882928.52074: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a6 30564 1726882928.52171: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a6 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } 30564 1726882928.52254: no more pending results, returning what we have 30564 1726882928.52258: results queue empty 30564 1726882928.52259: checking for any_errors_fatal 30564 1726882928.52275: done checking for any_errors_fatal 30564 1726882928.52276: checking for max_fail_percentage 30564 1726882928.52278: done checking for max_fail_percentage 30564 1726882928.52279: checking to see if all hosts have failed and the running result is not ok 30564 1726882928.52280: done checking to see if all hosts have failed 30564 1726882928.52281: getting the remaining hosts for this loop 30564 1726882928.52283: done getting the remaining hosts for this loop 30564 1726882928.52287: getting the next task for host managed_node2 30564 1726882928.52296: done getting next task for host managed_node2 30564 1726882928.52299: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882928.52305: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882928.52319: getting variables 30564 1726882928.52321: in VariableManager get_vars() 30564 1726882928.52372: Calling all_inventory to load vars for managed_node2 30564 1726882928.52375: Calling groups_inventory to load vars for managed_node2 30564 1726882928.52379: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882928.52390: Calling all_plugins_play to load vars for managed_node2 30564 1726882928.52393: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882928.52396: Calling groups_plugins_play to load vars for managed_node2 30564 1726882928.53383: WORKER PROCESS EXITING 30564 1726882928.54561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882928.56510: done with get_vars() 30564 1726882928.56536: done getting variables 30564 1726882928.56600: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network 
: Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:42:08 -0400 (0:00:00.070) 0:02:07.147 ****** 30564 1726882928.56640: entering _queue_task() for managed_node2/debug 30564 1726882928.56947: worker is 1 (out of 1 available) 30564 1726882928.56960: exiting _queue_task() for managed_node2/debug 30564 1726882928.56973: done queuing things up, now waiting for results queue to drain 30564 1726882928.56974: waiting for pending results... 30564 1726882928.57277: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 30564 1726882928.57434: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a7 30564 1726882928.57453: variable 'ansible_search_path' from source: unknown 30564 1726882928.57464: variable 'ansible_search_path' from source: unknown 30564 1726882928.57510: calling self._execute() 30564 1726882928.57630: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.57645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.57659: variable 'omit' from source: magic vars 30564 1726882928.58084: variable 'ansible_distribution_major_version' from source: facts 30564 1726882928.58102: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882928.58114: variable 'omit' from source: magic vars 30564 1726882928.58193: variable 'omit' from source: magic vars 30564 1726882928.58234: variable 'omit' from source: magic vars 30564 1726882928.58281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882928.58322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882928.58351: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882928.58374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882928.58390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882928.58426: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882928.58435: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.58442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.58554: Set connection var ansible_timeout to 10 30564 1726882928.58569: Set connection var ansible_pipelining to False 30564 1726882928.58576: Set connection var ansible_shell_type to sh 30564 1726882928.58586: Set connection var ansible_shell_executable to /bin/sh 30564 1726882928.58597: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882928.58603: Set connection var ansible_connection to ssh 30564 1726882928.58633: variable 'ansible_shell_executable' from source: unknown 30564 1726882928.58642: variable 'ansible_connection' from source: unknown 30564 1726882928.58648: variable 'ansible_module_compression' from source: unknown 30564 1726882928.58654: variable 'ansible_shell_type' from source: unknown 30564 1726882928.58660: variable 'ansible_shell_executable' from source: unknown 30564 1726882928.58674: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.58684: variable 'ansible_pipelining' from source: unknown 30564 1726882928.58691: variable 'ansible_timeout' from source: unknown 30564 1726882928.58699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.58855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882928.58875: variable 'omit' from source: magic vars 30564 1726882928.58890: starting attempt loop 30564 1726882928.58898: running the handler 30564 1726882928.58954: variable '__network_connections_result' from source: set_fact 30564 1726882928.59042: variable '__network_connections_result' from source: set_fact 30564 1726882928.59171: handler run complete 30564 1726882928.59201: attempt loop complete, returning result 30564 1726882928.59210: _execute() done 30564 1726882928.59221: dumping result to json 30564 1726882928.59229: done dumping result, returning 30564 1726882928.59241: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-4216-acec-0000000026a7] 30564 1726882928.59252: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a7 30564 1726882928.59372: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a7 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "statebr", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete\n", "stderr_lines": [ "[002] #0, state:down persistent_state:absent, 'statebr': no connection matches 'statebr' to delete" ] } } 30564 1726882928.59473: no more pending results, returning what we have 30564 1726882928.59478: results queue empty 30564 1726882928.59479: checking for any_errors_fatal 30564 1726882928.59486: done checking for any_errors_fatal 30564 
1726882928.59487: checking for max_fail_percentage 30564 1726882928.59489: done checking for max_fail_percentage 30564 1726882928.59490: checking to see if all hosts have failed and the running result is not ok 30564 1726882928.59490: done checking to see if all hosts have failed 30564 1726882928.59492: getting the remaining hosts for this loop 30564 1726882928.59493: done getting the remaining hosts for this loop 30564 1726882928.59497: getting the next task for host managed_node2 30564 1726882928.59505: done getting next task for host managed_node2 30564 1726882928.59509: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882928.59515: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882928.59530: getting variables 30564 1726882928.59532: in VariableManager get_vars() 30564 1726882928.59581: Calling all_inventory to load vars for managed_node2 30564 1726882928.59584: Calling groups_inventory to load vars for managed_node2 30564 1726882928.59586: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882928.59602: Calling all_plugins_play to load vars for managed_node2 30564 1726882928.59605: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882928.59608: Calling groups_plugins_play to load vars for managed_node2 30564 1726882928.60604: WORKER PROCESS EXITING 30564 1726882928.61416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882928.63218: done with get_vars() 30564 1726882928.63244: done getting variables 30564 1726882928.63300: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:42:08 -0400 (0:00:00.066) 0:02:07.214 ****** 30564 1726882928.63333: entering _queue_task() for managed_node2/debug 30564 1726882928.63608: worker is 1 (out of 1 available) 30564 1726882928.63620: exiting _queue_task() for managed_node2/debug 30564 1726882928.63630: done queuing things up, now waiting for results queue to drain 30564 1726882928.63632: waiting for pending results... 
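
The two debug tasks above (task paths `roles/network/tasks/main.yml:177` and `:181`) print fields of the registered `__network_connections_result` fact. A minimal sketch of what such tasks look like, assuming the usual `debug`/`var` form; the exact task bodies in the collection are not shown in this log:

```yaml
# Sketch of the debug tasks run above (shape assumed from the
# task names and the variables they printed in the log):
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```

This matches the log output: the first task printed only `stderr_lines`, the second the full result object including `_invocation.module_args`.
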
30564 1726882928.63921: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 30564 1726882928.64058: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a8 30564 1726882928.64083: variable 'ansible_search_path' from source: unknown 30564 1726882928.64091: variable 'ansible_search_path' from source: unknown 30564 1726882928.64135: calling self._execute() 30564 1726882928.64248: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.64260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.64279: variable 'omit' from source: magic vars 30564 1726882928.64691: variable 'ansible_distribution_major_version' from source: facts 30564 1726882928.64708: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882928.64845: variable 'network_state' from source: role '' defaults 30564 1726882928.64863: Evaluated conditional (network_state != {}): False 30564 1726882928.64877: when evaluation is False, skipping this task 30564 1726882928.64885: _execute() done 30564 1726882928.64893: dumping result to json 30564 1726882928.64900: done dumping result, returning 30564 1726882928.64911: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-4216-acec-0000000026a8] 30564 1726882928.64923: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a8 30564 1726882928.65040: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a8 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 30564 1726882928.65094: no more pending results, returning what we have 30564 1726882928.65099: results queue empty 30564 1726882928.65100: checking for any_errors_fatal 30564 1726882928.65108: done checking for any_errors_fatal 30564 1726882928.65108: checking for max_fail_percentage 30564 1726882928.65111: done 
checking for max_fail_percentage 30564 1726882928.65112: checking to see if all hosts have failed and the running result is not ok 30564 1726882928.65113: done checking to see if all hosts have failed 30564 1726882928.65114: getting the remaining hosts for this loop 30564 1726882928.65115: done getting the remaining hosts for this loop 30564 1726882928.65120: getting the next task for host managed_node2 30564 1726882928.65129: done getting next task for host managed_node2 30564 1726882928.65133: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882928.65142: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882928.65176: getting variables 30564 1726882928.65179: in VariableManager get_vars() 30564 1726882928.65227: Calling all_inventory to load vars for managed_node2 30564 1726882928.65230: Calling groups_inventory to load vars for managed_node2 30564 1726882928.65233: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882928.65246: Calling all_plugins_play to load vars for managed_node2 30564 1726882928.65249: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882928.65252: Calling groups_plugins_play to load vars for managed_node2 30564 1726882928.66422: WORKER PROCESS EXITING 30564 1726882928.67078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882928.68757: done with get_vars() 30564 1726882928.68780: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:42:08 -0400 (0:00:00.055) 0:02:07.270 ****** 30564 1726882928.68874: entering _queue_task() for managed_node2/ping 30564 1726882928.69257: worker is 1 (out of 1 available) 30564 1726882928.69271: exiting _queue_task() for managed_node2/ping 30564 1726882928.69284: done queuing things up, now waiting for results queue to drain 30564 1726882928.69285: waiting for pending results... 
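
The "Show debug messages for the network_state" task above was skipped with `false_condition: "network_state != {}"`. That skip pattern corresponds to a conditional task of roughly this shape; the variable printed is an assumption, since the task never ran in this log:

```yaml
# Sketch of a conditionally skipped debug task. The when: clause
# is taken from the logged false_condition; the var name is assumed.
- name: Show debug messages for the network_state
  debug:
    var: __network_state   # hypothetical; not shown in the log
  when: network_state != {}   # evaluated False here, so the task skips
```

Because `network_state` defaulted to `{}` in this run (it comes from the role's defaults, per the "variable 'network_state' from source: role '' defaults" lines), every `network_state`-gated task in this section was skipped.
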
30564 1726882928.69634: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 30564 1726882928.69844: in run() - task 0e448fcc-3ce9-4216-acec-0000000026a9 30564 1726882928.69884: variable 'ansible_search_path' from source: unknown 30564 1726882928.69892: variable 'ansible_search_path' from source: unknown 30564 1726882928.69931: calling self._execute() 30564 1726882928.70083: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.70094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.70108: variable 'omit' from source: magic vars 30564 1726882928.70503: variable 'ansible_distribution_major_version' from source: facts 30564 1726882928.70520: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882928.70532: variable 'omit' from source: magic vars 30564 1726882928.70604: variable 'omit' from source: magic vars 30564 1726882928.70639: variable 'omit' from source: magic vars 30564 1726882928.70686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882928.70730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882928.70754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882928.70782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882928.70799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882928.70838: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882928.70848: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.70856: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882928.70971: Set connection var ansible_timeout to 10 30564 1726882928.70985: Set connection var ansible_pipelining to False 30564 1726882928.70992: Set connection var ansible_shell_type to sh 30564 1726882928.71003: Set connection var ansible_shell_executable to /bin/sh 30564 1726882928.71016: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882928.71022: Set connection var ansible_connection to ssh 30564 1726882928.71054: variable 'ansible_shell_executable' from source: unknown 30564 1726882928.71062: variable 'ansible_connection' from source: unknown 30564 1726882928.71075: variable 'ansible_module_compression' from source: unknown 30564 1726882928.71084: variable 'ansible_shell_type' from source: unknown 30564 1726882928.71091: variable 'ansible_shell_executable' from source: unknown 30564 1726882928.71098: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882928.71105: variable 'ansible_pipelining' from source: unknown 30564 1726882928.71112: variable 'ansible_timeout' from source: unknown 30564 1726882928.71119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882928.71330: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882928.71348: variable 'omit' from source: magic vars 30564 1726882928.71362: starting attempt loop 30564 1726882928.71375: running the handler 30564 1726882928.71395: _low_level_execute_command(): starting 30564 1726882928.71408: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882928.72174: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.72189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882928.72205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.72228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.72278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.72291: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.72307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.72326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.72339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.72352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.72365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.72383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.72399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.72413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.72425: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.72441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.72523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.72546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.72571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.72782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882928.74380: stdout chunk (state=3): >>>/root <<< 30564 1726882928.74559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882928.74563: stdout chunk (state=3): >>><<< 30564 1726882928.74570: stderr chunk (state=3): >>><<< 30564 1726882928.74586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882928.74599: _low_level_execute_command(): starting 30564 1726882928.74604: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142 `" && echo ansible-tmp-1726882928.7458427-36151-11014877271142="` echo /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142 `" ) && sleep 0' 30564 1726882928.76142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 30564 1726882928.76184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.76188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.76190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.76233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.76280: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.76290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.76303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.76314: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.76320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.76328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.76336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.76388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.76395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.76402: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.76411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.76487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.76651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.76662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 30564 1726882928.76793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.78658: stdout chunk (state=3): >>>ansible-tmp-1726882928.7458427-36151-11014877271142=/root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142 <<< 30564 1726882928.78834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882928.78838: stdout chunk (state=3): >>><<< 30564 1726882928.78845: stderr chunk (state=3): >>><<< 30564 1726882928.78865: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882928.7458427-36151-11014877271142=/root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882928.78912: variable 'ansible_module_compression' from source: unknown 30564 1726882928.78952: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 30564 1726882928.78987: variable 'ansible_facts' from source: unknown 30564 1726882928.79063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142/AnsiballZ_ping.py 30564 1726882928.79670: Sending initial data 30564 1726882928.79673: Sent initial data (152 bytes) 30564 1726882928.81970: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.82026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.82136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.82151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.82190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.82197: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.82206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.82219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.82228: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.82235: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.82246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.82255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.82270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.82276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 
1726882928.82284: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.82293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.82479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.82497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.82509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.82636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.84431: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882928.84529: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882928.84628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp25ugpuqf /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142/AnsiballZ_ping.py <<< 30564 1726882928.84720: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882928.86234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882928.86240: stderr chunk (state=3): >>><<< 30564 1726882928.86244: stdout chunk (state=3): >>><<< 30564 1726882928.86270: done transferring module 
to remote 30564 1726882928.86280: _low_level_execute_command(): starting 30564 1726882928.86285: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142/ /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142/AnsiballZ_ping.py && sleep 0' 30564 1726882928.87742: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.87779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.87790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.87804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.87839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.87890: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.87900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.87913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.87920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.87927: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.87935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.87944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.87955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.87999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.88006: stderr chunk (state=3): >>>debug2: match found <<< 30564 
1726882928.88016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.88087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.88222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.88233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.88354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882928.90194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882928.90197: stdout chunk (state=3): >>><<< 30564 1726882928.90204: stderr chunk (state=3): >>><<< 30564 1726882928.90222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882928.90225: 
_low_level_execute_command(): starting 30564 1726882928.90230: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142/AnsiballZ_ping.py && sleep 0' 30564 1726882928.91803: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882928.91961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.91974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.91988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.92026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.92033: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882928.92043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.92058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882928.92072: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882928.92076: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882928.92084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882928.92094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882928.92105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882928.92113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882928.92119: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882928.92128: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882928.92204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882928.92298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882928.92310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882928.92443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882929.05350: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 30564 1726882929.06405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 30564 1726882929.06409: stdout chunk (state=3): >>><<< 30564 1726882929.06416: stderr chunk (state=3): >>><<< 30564 1726882929.06439: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882929.06461: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882929.06473: _low_level_execute_command(): starting 30564 1726882929.06478: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882928.7458427-36151-11014877271142/ > /dev/null 2>&1 && sleep 0' 30564 1726882929.08675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882929.08797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882929.08813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882929.08830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882929.08880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882929.08895: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882929.08909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882929.08926: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 30564 1726882929.09015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882929.09027: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882929.09038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882929.09051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882929.09072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882929.09085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882929.09096: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882929.09112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882929.09194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882929.09235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882929.09250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882929.09562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882929.11458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882929.11461: stdout chunk (state=3): >>><<< 30564 1726882929.11466: stderr chunk (state=3): >>><<< 30564 1726882929.11774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882929.11777: handler run complete 30564 1726882929.11780: attempt loop complete, returning result 30564 1726882929.11782: _execute() done 30564 1726882929.11784: dumping result to json 30564 1726882929.11785: done dumping result, returning 30564 1726882929.11787: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-4216-acec-0000000026a9] 30564 1726882929.11789: sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a9 30564 1726882929.11855: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000026a9 30564 1726882929.11858: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 30564 1726882929.11945: no more pending results, returning what we have 30564 1726882929.11948: results queue empty 30564 1726882929.11950: checking for any_errors_fatal 30564 1726882929.11955: done checking for any_errors_fatal 30564 1726882929.11956: checking for max_fail_percentage 30564 1726882929.11958: done checking for max_fail_percentage 30564 1726882929.11959: checking to see if all hosts have failed and the running result is not ok 30564 1726882929.11959: done checking to see if all hosts 
have failed 30564 1726882929.11960: getting the remaining hosts for this loop 30564 1726882929.11962: done getting the remaining hosts for this loop 30564 1726882929.11969: getting the next task for host managed_node2 30564 1726882929.11979: done getting next task for host managed_node2 30564 1726882929.11982: ^ task is: TASK: meta (role_complete) 30564 1726882929.11987: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882929.12002: getting variables 30564 1726882929.12004: in VariableManager get_vars() 30564 1726882929.12052: Calling all_inventory to load vars for managed_node2 30564 1726882929.12054: Calling groups_inventory to load vars for managed_node2 30564 1726882929.12056: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882929.12071: Calling all_plugins_play to load vars for managed_node2 30564 1726882929.12074: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882929.12078: Calling groups_plugins_play to load vars for managed_node2 30564 1726882929.14996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882929.18882: done with get_vars() 30564 1726882929.18910: done getting variables 30564 1726882929.18995: done queuing things up, now waiting for results queue to drain 30564 1726882929.18997: results queue empty 30564 1726882929.18998: checking for any_errors_fatal 30564 1726882929.19000: done checking for any_errors_fatal 30564 1726882929.19001: checking for max_fail_percentage 30564 1726882929.19002: done checking for max_fail_percentage 30564 1726882929.19003: checking to see if all hosts have failed and the running result is not ok 30564 1726882929.19003: done checking to see if all hosts have failed 30564 1726882929.19004: getting the remaining hosts for this loop 30564 1726882929.19005: done getting the remaining hosts for this loop 30564 1726882929.19008: getting the next task for host managed_node2 30564 1726882929.19013: done getting next task for host managed_node2 30564 1726882929.19015: ^ task is: TASK: Asserts 30564 1726882929.19018: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882929.19021: getting variables 30564 1726882929.19022: in VariableManager get_vars() 30564 1726882929.19035: Calling all_inventory to load vars for managed_node2 30564 1726882929.19037: Calling groups_inventory to load vars for managed_node2 30564 1726882929.19039: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882929.19044: Calling all_plugins_play to load vars for managed_node2 30564 1726882929.19047: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882929.19050: Calling groups_plugins_play to load vars for managed_node2 30564 1726882929.21771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882929.26155: done with get_vars() 30564 1726882929.26190: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:42:09 -0400 (0:00:00.573) 0:02:07.843 ****** 30564 1726882929.26267: entering _queue_task() for managed_node2/include_tasks 30564 1726882929.27619: worker is 1 (out of 1 available) 30564 1726882929.27630: exiting _queue_task() for managed_node2/include_tasks 30564 1726882929.27641: done queuing things up, now waiting for results queue to drain 30564 1726882929.27642: waiting for pending results... 
30564 1726882929.28343: running TaskExecutor() for managed_node2/TASK: Asserts 30564 1726882929.28632: in run() - task 0e448fcc-3ce9-4216-acec-0000000020b2 30564 1726882929.28645: variable 'ansible_search_path' from source: unknown 30564 1726882929.28648: variable 'ansible_search_path' from source: unknown 30564 1726882929.28699: variable 'lsr_assert' from source: include params 30564 1726882929.28916: variable 'lsr_assert' from source: include params 30564 1726882929.28983: variable 'omit' from source: magic vars 30564 1726882929.29350: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.29358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.29372: variable 'omit' from source: magic vars 30564 1726882929.29956: variable 'ansible_distribution_major_version' from source: facts 30564 1726882929.29973: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882929.29978: variable 'item' from source: unknown 30564 1726882929.30040: variable 'item' from source: unknown 30564 1726882929.30187: variable 'item' from source: unknown 30564 1726882929.30245: variable 'item' from source: unknown 30564 1726882929.30452: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.30460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.30472: variable 'omit' from source: magic vars 30564 1726882929.30806: variable 'ansible_distribution_major_version' from source: facts 30564 1726882929.30810: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882929.30816: variable 'item' from source: unknown 30564 1726882929.30993: variable 'item' from source: unknown 30564 1726882929.31021: variable 'item' from source: unknown 30564 1726882929.31128: variable 'item' from source: unknown 30564 1726882929.31201: dumping result to json 30564 1726882929.31973: done dumping result, returning 30564 
1726882929.31980: done running TaskExecutor() for managed_node2/TASK: Asserts [0e448fcc-3ce9-4216-acec-0000000020b2] 30564 1726882929.31986: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b2 30564 1726882929.32041: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b2 30564 1726882929.32044: WORKER PROCESS EXITING 30564 1726882929.32097: no more pending results, returning what we have 30564 1726882929.32102: in VariableManager get_vars() 30564 1726882929.32153: Calling all_inventory to load vars for managed_node2 30564 1726882929.32156: Calling groups_inventory to load vars for managed_node2 30564 1726882929.32159: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882929.32176: Calling all_plugins_play to load vars for managed_node2 30564 1726882929.32180: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882929.32183: Calling groups_plugins_play to load vars for managed_node2 30564 1726882929.35844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882929.41783: done with get_vars() 30564 1726882929.41818: variable 'ansible_search_path' from source: unknown 30564 1726882929.41819: variable 'ansible_search_path' from source: unknown 30564 1726882929.41866: variable 'ansible_search_path' from source: unknown 30564 1726882929.41870: variable 'ansible_search_path' from source: unknown 30564 1726882929.41904: we have included files to process 30564 1726882929.41905: generating all_blocks data 30564 1726882929.41908: done generating all_blocks data 30564 1726882929.41914: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882929.41916: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882929.41918: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 30564 1726882929.42043: in VariableManager get_vars() 30564 1726882929.43074: done with get_vars() 30564 1726882929.43201: done processing included file 30564 1726882929.43203: iterating over new_blocks loaded from include file 30564 1726882929.43204: in VariableManager get_vars() 30564 1726882929.43223: done with get_vars() 30564 1726882929.43225: filtering new block on tags 30564 1726882929.43261: done filtering new block on tags 30564 1726882929.43266: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 => (item=tasks/assert_profile_absent.yml) 30564 1726882929.43274: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30564 1726882929.43275: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30564 1726882929.43278: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml 30564 1726882929.44769: done processing included file 30564 1726882929.44771: iterating over new_blocks loaded from include file 30564 1726882929.44773: in VariableManager get_vars() 30564 1726882929.44790: done with get_vars() 30564 1726882929.44792: filtering new block on tags 30564 1726882929.44836: done filtering new block on tags 30564 1726882929.44839: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml for managed_node2 => (item=tasks/get_NetworkManager_NVR.yml) 30564 1726882929.44842: extending task lists 
for all hosts with included blocks 30564 1726882929.48040: done extending task lists 30564 1726882929.48041: done processing included files 30564 1726882929.48042: results queue empty 30564 1726882929.48043: checking for any_errors_fatal 30564 1726882929.48045: done checking for any_errors_fatal 30564 1726882929.48046: checking for max_fail_percentage 30564 1726882929.48047: done checking for max_fail_percentage 30564 1726882929.48048: checking to see if all hosts have failed and the running result is not ok 30564 1726882929.48049: done checking to see if all hosts have failed 30564 1726882929.48049: getting the remaining hosts for this loop 30564 1726882929.48051: done getting the remaining hosts for this loop 30564 1726882929.48054: getting the next task for host managed_node2 30564 1726882929.48058: done getting next task for host managed_node2 30564 1726882929.48060: ^ task is: TASK: Include the task 'get_profile_stat.yml' 30564 1726882929.48065: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882929.48070: getting variables 30564 1726882929.48077: in VariableManager get_vars() 30564 1726882929.48090: Calling all_inventory to load vars for managed_node2 30564 1726882929.48093: Calling groups_inventory to load vars for managed_node2 30564 1726882929.48095: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882929.48101: Calling all_plugins_play to load vars for managed_node2 30564 1726882929.48103: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882929.48107: Calling groups_plugins_play to load vars for managed_node2 30564 1726882929.50927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882929.55287: done with get_vars() 30564 1726882929.55312: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:42:09 -0400 (0:00:00.292) 0:02:08.136 ****** 30564 1726882929.55516: entering _queue_task() for managed_node2/include_tasks 30564 1726882929.56442: worker is 1 (out of 1 available) 30564 1726882929.56454: exiting _queue_task() for managed_node2/include_tasks 30564 1726882929.56470: done queuing things up, now waiting for results queue to drain 30564 1726882929.56472: waiting for pending results... 
30564 1726882929.57060: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 30564 1726882929.57194: in run() - task 0e448fcc-3ce9-4216-acec-000000002804 30564 1726882929.57222: variable 'ansible_search_path' from source: unknown 30564 1726882929.57231: variable 'ansible_search_path' from source: unknown 30564 1726882929.57276: calling self._execute() 30564 1726882929.58019: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.58030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.58043: variable 'omit' from source: magic vars 30564 1726882929.58570: variable 'ansible_distribution_major_version' from source: facts 30564 1726882929.58589: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882929.58605: _execute() done 30564 1726882929.58613: dumping result to json 30564 1726882929.58661: done dumping result, returning 30564 1726882929.58679: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-4216-acec-000000002804] 30564 1726882929.58691: sending task result for task 0e448fcc-3ce9-4216-acec-000000002804 30564 1726882929.58915: no more pending results, returning what we have 30564 1726882929.58921: in VariableManager get_vars() 30564 1726882929.58986: Calling all_inventory to load vars for managed_node2 30564 1726882929.58989: Calling groups_inventory to load vars for managed_node2 30564 1726882929.58994: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882929.59009: Calling all_plugins_play to load vars for managed_node2 30564 1726882929.59012: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882929.59016: Calling groups_plugins_play to load vars for managed_node2 30564 1726882929.60258: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002804 30564 1726882929.60262: WORKER PROCESS EXITING 30564 
1726882929.62188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882929.65185: done with get_vars() 30564 1726882929.65211: variable 'ansible_search_path' from source: unknown 30564 1726882929.65213: variable 'ansible_search_path' from source: unknown 30564 1726882929.65223: variable 'item' from source: include params 30564 1726882929.65343: variable 'item' from source: include params 30564 1726882929.65387: we have included files to process 30564 1726882929.65388: generating all_blocks data 30564 1726882929.65390: done generating all_blocks data 30564 1726882929.65391: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882929.65392: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882929.65394: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 30564 1726882929.66700: done processing included file 30564 1726882929.66702: iterating over new_blocks loaded from include file 30564 1726882929.66704: in VariableManager get_vars() 30564 1726882929.66725: done with get_vars() 30564 1726882929.66726: filtering new block on tags 30564 1726882929.66800: done filtering new block on tags 30564 1726882929.66808: in VariableManager get_vars() 30564 1726882929.66827: done with get_vars() 30564 1726882929.66829: filtering new block on tags 30564 1726882929.66890: done filtering new block on tags 30564 1726882929.66892: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 30564 1726882929.66898: extending task lists for all hosts with included blocks 30564 1726882929.67300: done 
extending task lists 30564 1726882929.67302: done processing included files 30564 1726882929.67303: results queue empty 30564 1726882929.67304: checking for any_errors_fatal 30564 1726882929.67308: done checking for any_errors_fatal 30564 1726882929.67308: checking for max_fail_percentage 30564 1726882929.67309: done checking for max_fail_percentage 30564 1726882929.67310: checking to see if all hosts have failed and the running result is not ok 30564 1726882929.67311: done checking to see if all hosts have failed 30564 1726882929.67312: getting the remaining hosts for this loop 30564 1726882929.67313: done getting the remaining hosts for this loop 30564 1726882929.67316: getting the next task for host managed_node2 30564 1726882929.67320: done getting next task for host managed_node2 30564 1726882929.67323: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882929.67326: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 30564 1726882929.67328: getting variables 30564 1726882929.67329: in VariableManager get_vars() 30564 1726882929.67340: Calling all_inventory to load vars for managed_node2 30564 1726882929.67343: Calling groups_inventory to load vars for managed_node2 30564 1726882929.67459: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882929.67471: Calling all_plugins_play to load vars for managed_node2 30564 1726882929.67474: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882929.67477: Calling groups_plugins_play to load vars for managed_node2 30564 1726882929.69561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882929.72636: done with get_vars() 30564 1726882929.72794: done getting variables 30564 1726882929.72840: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:42:09 -0400 (0:00:00.173) 0:02:08.310 ****** 30564 1726882929.72877: entering _queue_task() for managed_node2/set_fact 30564 1726882929.73801: worker is 1 (out of 1 available) 30564 1726882929.73813: exiting _queue_task() for managed_node2/set_fact 30564 1726882929.73828: done queuing things up, now waiting for results queue to drain 30564 1726882929.73829: waiting for pending results... 
30564 1726882929.74380: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 30564 1726882929.74594: in run() - task 0e448fcc-3ce9-4216-acec-000000002888 30564 1726882929.74610: variable 'ansible_search_path' from source: unknown 30564 1726882929.74613: variable 'ansible_search_path' from source: unknown 30564 1726882929.74763: calling self._execute() 30564 1726882929.75052: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.75118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.75180: variable 'omit' from source: magic vars 30564 1726882929.76022: variable 'ansible_distribution_major_version' from source: facts 30564 1726882929.76151: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882929.76155: variable 'omit' from source: magic vars 30564 1726882929.76218: variable 'omit' from source: magic vars 30564 1726882929.76273: variable 'omit' from source: magic vars 30564 1726882929.76317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882929.76349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882929.76486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882929.76505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882929.76515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882929.76544: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882929.76547: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.76550: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 30564 1726882929.76766: Set connection var ansible_timeout to 10 30564 1726882929.76906: Set connection var ansible_pipelining to False 30564 1726882929.76909: Set connection var ansible_shell_type to sh 30564 1726882929.76915: Set connection var ansible_shell_executable to /bin/sh 30564 1726882929.76923: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882929.76927: Set connection var ansible_connection to ssh 30564 1726882929.76949: variable 'ansible_shell_executable' from source: unknown 30564 1726882929.76953: variable 'ansible_connection' from source: unknown 30564 1726882929.76956: variable 'ansible_module_compression' from source: unknown 30564 1726882929.76958: variable 'ansible_shell_type' from source: unknown 30564 1726882929.76961: variable 'ansible_shell_executable' from source: unknown 30564 1726882929.76965: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.76970: variable 'ansible_pipelining' from source: unknown 30564 1726882929.76972: variable 'ansible_timeout' from source: unknown 30564 1726882929.76975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.77332: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882929.77347: variable 'omit' from source: magic vars 30564 1726882929.77352: starting attempt loop 30564 1726882929.77355: running the handler 30564 1726882929.77374: handler run complete 30564 1726882929.77382: attempt loop complete, returning result 30564 1726882929.77387: _execute() done 30564 1726882929.77390: dumping result to json 30564 1726882929.77445: done dumping result, returning 30564 1726882929.77453: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-4216-acec-000000002888] 30564 1726882929.77459: sending task result for task 0e448fcc-3ce9-4216-acec-000000002888 30564 1726882929.77679: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002888 30564 1726882929.77683: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 30564 1726882929.77762: no more pending results, returning what we have 30564 1726882929.77770: results queue empty 30564 1726882929.77772: checking for any_errors_fatal 30564 1726882929.77774: done checking for any_errors_fatal 30564 1726882929.77775: checking for max_fail_percentage 30564 1726882929.77777: done checking for max_fail_percentage 30564 1726882929.77778: checking to see if all hosts have failed and the running result is not ok 30564 1726882929.77778: done checking to see if all hosts have failed 30564 1726882929.77779: getting the remaining hosts for this loop 30564 1726882929.77781: done getting the remaining hosts for this loop 30564 1726882929.77785: getting the next task for host managed_node2 30564 1726882929.77795: done getting next task for host managed_node2 30564 1726882929.77797: ^ task is: TASK: Stat profile file 30564 1726882929.77805: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882929.77810: getting variables 30564 1726882929.77812: in VariableManager get_vars() 30564 1726882929.77869: Calling all_inventory to load vars for managed_node2 30564 1726882929.77873: Calling groups_inventory to load vars for managed_node2 30564 1726882929.77877: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882929.77891: Calling all_plugins_play to load vars for managed_node2 30564 1726882929.77895: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882929.77898: Calling groups_plugins_play to load vars for managed_node2 30564 1726882929.80986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882929.85554: done with get_vars() 30564 1726882929.85584: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:42:09 -0400 (0:00:00.129) 0:02:08.439 ****** 30564 1726882929.85807: entering _queue_task() for managed_node2/stat 30564 1726882929.86554: worker is 1 (out of 1 available) 30564 1726882929.86590: exiting _queue_task() for managed_node2/stat 30564 1726882929.86604: done queuing things up, now waiting for results queue to drain 30564 1726882929.86606: 
waiting for pending results... 30564 1726882929.87534: running TaskExecutor() for managed_node2/TASK: Stat profile file 30564 1726882929.87749: in run() - task 0e448fcc-3ce9-4216-acec-000000002889 30564 1726882929.87763: variable 'ansible_search_path' from source: unknown 30564 1726882929.87770: variable 'ansible_search_path' from source: unknown 30564 1726882929.87807: calling self._execute() 30564 1726882929.87911: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.87916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.87928: variable 'omit' from source: magic vars 30564 1726882929.88836: variable 'ansible_distribution_major_version' from source: facts 30564 1726882929.88849: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882929.88854: variable 'omit' from source: magic vars 30564 1726882929.89022: variable 'omit' from source: magic vars 30564 1726882929.89231: variable 'profile' from source: play vars 30564 1726882929.89234: variable 'interface' from source: play vars 30564 1726882929.89347: variable 'interface' from source: play vars 30564 1726882929.89370: variable 'omit' from source: magic vars 30564 1726882929.89414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882929.89449: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882929.89476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882929.89494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882929.89505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882929.89539: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 30564 1726882929.89543: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.89545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.89655: Set connection var ansible_timeout to 10 30564 1726882929.89661: Set connection var ansible_pipelining to False 30564 1726882929.89665: Set connection var ansible_shell_type to sh 30564 1726882929.89672: Set connection var ansible_shell_executable to /bin/sh 30564 1726882929.89680: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882929.89683: Set connection var ansible_connection to ssh 30564 1726882929.89712: variable 'ansible_shell_executable' from source: unknown 30564 1726882929.89716: variable 'ansible_connection' from source: unknown 30564 1726882929.89718: variable 'ansible_module_compression' from source: unknown 30564 1726882929.89721: variable 'ansible_shell_type' from source: unknown 30564 1726882929.89725: variable 'ansible_shell_executable' from source: unknown 30564 1726882929.89727: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882929.89733: variable 'ansible_pipelining' from source: unknown 30564 1726882929.89736: variable 'ansible_timeout' from source: unknown 30564 1726882929.89738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882929.89964: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882929.89974: variable 'omit' from source: magic vars 30564 1726882929.89980: starting attempt loop 30564 1726882929.89982: running the handler 30564 1726882929.89998: _low_level_execute_command(): starting 30564 1726882929.90007: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 
1726882929.90790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882929.90804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882929.90816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882929.90834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882929.90877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882929.90886: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882929.90896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882929.90914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882929.90922: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882929.90932: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882929.90941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882929.90951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882929.90965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882929.90973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882929.90981: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882929.90990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882929.91071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882929.91089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882929.91101: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882929.91238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882929.92905: stdout chunk (state=3): >>>/root <<< 30564 1726882929.93078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882929.93084: stdout chunk (state=3): >>><<< 30564 1726882929.93092: stderr chunk (state=3): >>><<< 30564 1726882929.93112: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882929.93125: _low_level_execute_command(): starting 30564 1726882929.93131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298 `" && echo 
ansible-tmp-1726882929.931123-36180-160166570992298="` echo /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298 `" ) && sleep 0' 30564 1726882929.94610: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882929.94617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882929.94731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882929.94774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882929.94777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882929.94791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882929.94798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882929.95001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882929.95020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882929.95162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882929.97041: stdout chunk (state=3): >>>ansible-tmp-1726882929.931123-36180-160166570992298=/root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298 <<< 30564 1726882929.97214: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882929.97218: stderr chunk (state=3): >>><<< 30564 1726882929.97223: stdout chunk (state=3): >>><<< 30564 1726882929.97246: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882929.931123-36180-160166570992298=/root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882929.97301: variable 'ansible_module_compression' from source: unknown 30564 1726882929.97366: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882929.97402: variable 'ansible_facts' from source: unknown 30564 1726882929.97475: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298/AnsiballZ_stat.py 30564 1726882929.97929: 
Sending initial data 30564 1726882929.97932: Sent initial data (152 bytes) 30564 1726882930.00578: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882930.00587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.00598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.00618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.00660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.00725: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882930.00738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.00751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882930.00758: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882930.00769: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882930.00775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.00784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.00795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.00802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.00809: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882930.00819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.00898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.01062: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.01078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.01210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.03029: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882930.03123: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882930.03227: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpbu_7t6h0 /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298/AnsiballZ_stat.py <<< 30564 1726882930.03321: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882930.04818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.05087: stderr chunk (state=3): >>><<< 30564 1726882930.05090: stdout chunk (state=3): >>><<< 30564 1726882930.05092: done transferring module to remote 30564 1726882930.05095: _low_level_execute_command(): starting 30564 1726882930.05097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298/ /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298/AnsiballZ_stat.py && sleep 0' 30564 
1726882930.06799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882930.06883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.06899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.06955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.07003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.07056: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882930.07078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.07098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882930.07110: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882930.07165: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882930.07184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.07197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.07211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.07222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.07231: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882930.07243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.07341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.07501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.07519: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.07714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.09605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.09609: stdout chunk (state=3): >>><<< 30564 1726882930.09612: stderr chunk (state=3): >>><<< 30564 1726882930.09726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882930.09730: _low_level_execute_command(): starting 30564 1726882930.09732: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298/AnsiballZ_stat.py && sleep 0' 30564 1726882930.10934: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882930.10938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.10966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.10972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.10975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.11029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.11289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.11292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.11410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.24353: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882930.25348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882930.25437: stderr chunk (state=3): >>><<< 30564 1726882930.25441: stdout chunk (state=3): >>><<< 30564 1726882930.25461: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882930.25494: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882930.25501: _low_level_execute_command(): starting 30564 1726882930.25507: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882929.931123-36180-160166570992298/ > /dev/null 2>&1 && sleep 0' 30564 1726882930.26185: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882930.26202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.26212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.26226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.26263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.26276: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882930.26291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.26313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 
1726882930.26321: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882930.26328: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882930.26335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.26345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.26356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.26365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.26373: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882930.26382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.26475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.26482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.26489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.26618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.28760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.28765: stderr chunk (state=3): >>><<< 30564 1726882930.28769: stdout chunk (state=3): >>><<< 30564 1726882930.28772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882930.28773: handler run complete 30564 1726882930.28775: attempt loop complete, returning result 30564 1726882930.28777: _execute() done 30564 1726882930.28779: dumping result to json 30564 1726882930.28780: done dumping result, returning 30564 1726882930.28782: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-4216-acec-000000002889] 30564 1726882930.28784: sending task result for task 0e448fcc-3ce9-4216-acec-000000002889 30564 1726882930.28972: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002889 30564 1726882930.28976: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 30564 1726882930.29047: no more pending results, returning what we have 30564 1726882930.29051: results queue empty 30564 1726882930.29052: checking for any_errors_fatal 30564 1726882930.29058: done checking for any_errors_fatal 30564 1726882930.29059: checking for max_fail_percentage 30564 1726882930.29061: done checking for max_fail_percentage 30564 1726882930.29062: checking to see if all hosts have failed and the running result is not ok 30564 1726882930.29062: done checking to see if all hosts have failed 30564 1726882930.29065: getting the remaining hosts for this loop 30564 
1726882930.29066: done getting the remaining hosts for this loop 30564 1726882930.29070: getting the next task for host managed_node2 30564 1726882930.29078: done getting next task for host managed_node2 30564 1726882930.29081: ^ task is: TASK: Set NM profile exist flag based on the profile files 30564 1726882930.29086: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882930.29092: getting variables 30564 1726882930.29093: in VariableManager get_vars() 30564 1726882930.29133: Calling all_inventory to load vars for managed_node2 30564 1726882930.29135: Calling groups_inventory to load vars for managed_node2 30564 1726882930.29137: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882930.29145: Calling all_plugins_play to load vars for managed_node2 30564 1726882930.29147: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882930.29149: Calling groups_plugins_play to load vars for managed_node2 30564 1726882930.30686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882930.32173: done with get_vars() 30564 1726882930.32192: done getting variables 30564 1726882930.32238: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:42:10 -0400 (0:00:00.464) 0:02:08.903 ****** 30564 1726882930.32263: entering _queue_task() for managed_node2/set_fact 30564 1726882930.32513: worker is 1 (out of 1 available) 30564 1726882930.32526: exiting _queue_task() for managed_node2/set_fact 30564 1726882930.32538: done queuing things up, now waiting for results queue to drain 30564 1726882930.32539: waiting for pending results... 
30564 1726882930.32732: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 30564 1726882930.32812: in run() - task 0e448fcc-3ce9-4216-acec-00000000288a 30564 1726882930.32825: variable 'ansible_search_path' from source: unknown 30564 1726882930.32829: variable 'ansible_search_path' from source: unknown 30564 1726882930.32856: calling self._execute() 30564 1726882930.32942: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.32945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.32954: variable 'omit' from source: magic vars 30564 1726882930.33253: variable 'ansible_distribution_major_version' from source: facts 30564 1726882930.33265: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882930.33356: variable 'profile_stat' from source: set_fact 30564 1726882930.33366: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882930.33370: when evaluation is False, skipping this task 30564 1726882930.33373: _execute() done 30564 1726882930.33376: dumping result to json 30564 1726882930.33381: done dumping result, returning 30564 1726882930.33386: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-4216-acec-00000000288a] 30564 1726882930.33392: sending task result for task 0e448fcc-3ce9-4216-acec-00000000288a 30564 1726882930.33484: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000288a 30564 1726882930.33487: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882930.33541: no more pending results, returning what we have 30564 1726882930.33544: results queue empty 30564 1726882930.33546: checking for any_errors_fatal 30564 1726882930.33555: done checking for any_errors_fatal 30564 1726882930.33555: 
checking for max_fail_percentage 30564 1726882930.33557: done checking for max_fail_percentage 30564 1726882930.33558: checking to see if all hosts have failed and the running result is not ok 30564 1726882930.33559: done checking to see if all hosts have failed 30564 1726882930.33560: getting the remaining hosts for this loop 30564 1726882930.33561: done getting the remaining hosts for this loop 30564 1726882930.33567: getting the next task for host managed_node2 30564 1726882930.33574: done getting next task for host managed_node2 30564 1726882930.33577: ^ task is: TASK: Get NM profile info 30564 1726882930.33582: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882930.33586: getting variables 30564 1726882930.33588: in VariableManager get_vars() 30564 1726882930.33640: Calling all_inventory to load vars for managed_node2 30564 1726882930.33643: Calling groups_inventory to load vars for managed_node2 30564 1726882930.33646: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882930.33657: Calling all_plugins_play to load vars for managed_node2 30564 1726882930.33660: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882930.33662: Calling groups_plugins_play to load vars for managed_node2 30564 1726882930.35980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882930.38652: done with get_vars() 30564 1726882930.38702: done getting variables 30564 1726882930.38775: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:42:10 -0400 (0:00:00.065) 0:02:08.969 ****** 30564 1726882930.38822: entering _queue_task() for managed_node2/shell 30564 1726882930.39600: worker is 1 (out of 1 available) 30564 1726882930.39625: exiting _queue_task() for managed_node2/shell 30564 1726882930.39644: done queuing things up, now waiting for results queue to drain 30564 1726882930.39646: waiting for pending results... 
30564 1726882930.40260: running TaskExecutor() for managed_node2/TASK: Get NM profile info 30564 1726882930.40541: in run() - task 0e448fcc-3ce9-4216-acec-00000000288b 30564 1726882930.40584: variable 'ansible_search_path' from source: unknown 30564 1726882930.40594: variable 'ansible_search_path' from source: unknown 30564 1726882930.40698: calling self._execute() 30564 1726882930.40921: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.40936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.40952: variable 'omit' from source: magic vars 30564 1726882930.41702: variable 'ansible_distribution_major_version' from source: facts 30564 1726882930.41719: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882930.41725: variable 'omit' from source: magic vars 30564 1726882930.41924: variable 'omit' from source: magic vars 30564 1726882930.42194: variable 'profile' from source: play vars 30564 1726882930.42215: variable 'interface' from source: play vars 30564 1726882930.42286: variable 'interface' from source: play vars 30564 1726882930.42317: variable 'omit' from source: magic vars 30564 1726882930.42363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882930.42402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882930.42439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882930.42457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882930.42470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882930.42503: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 
1726882930.42506: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.42509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.42621: Set connection var ansible_timeout to 10 30564 1726882930.42632: Set connection var ansible_pipelining to False 30564 1726882930.42642: Set connection var ansible_shell_type to sh 30564 1726882930.42653: Set connection var ansible_shell_executable to /bin/sh 30564 1726882930.42662: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882930.42667: Set connection var ansible_connection to ssh 30564 1726882930.42698: variable 'ansible_shell_executable' from source: unknown 30564 1726882930.42701: variable 'ansible_connection' from source: unknown 30564 1726882930.42704: variable 'ansible_module_compression' from source: unknown 30564 1726882930.42706: variable 'ansible_shell_type' from source: unknown 30564 1726882930.42708: variable 'ansible_shell_executable' from source: unknown 30564 1726882930.42710: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.42715: variable 'ansible_pipelining' from source: unknown 30564 1726882930.42717: variable 'ansible_timeout' from source: unknown 30564 1726882930.42721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.42899: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882930.42911: variable 'omit' from source: magic vars 30564 1726882930.42915: starting attempt loop 30564 1726882930.42918: running the handler 30564 1726882930.42929: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882930.42949: _low_level_execute_command(): starting 30564 1726882930.42961: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882930.43846: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882930.43858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.43889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.43905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.43942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.43948: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882930.43958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.43977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882930.43996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882930.44006: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882930.44015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.44024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.44037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.44044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.44051: stderr chunk (state=3): >>>debug2: match found <<< 30564 
1726882930.44062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.44149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.44166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.44180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.44344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.45984: stdout chunk (state=3): >>>/root <<< 30564 1726882930.46167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.46177: stdout chunk (state=3): >>><<< 30564 1726882930.46180: stderr chunk (state=3): >>><<< 30564 1726882930.46203: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 30564 1726882930.46233: _low_level_execute_command(): starting 30564 1726882930.46237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445 `" && echo ansible-tmp-1726882930.4619935-36204-170266225084445="` echo /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445 `" ) && sleep 0' 30564 1726882930.46908: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882930.46919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.46922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.46937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.46979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.46985: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882930.46996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.47013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882930.47018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882930.47025: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882930.47033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.47042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.47053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.47060: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.47069: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882930.47082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.47151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.47166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.47180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.47306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.49194: stdout chunk (state=3): >>>ansible-tmp-1726882930.4619935-36204-170266225084445=/root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445 <<< 30564 1726882930.49346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.49593: stderr chunk (state=3): >>><<< 30564 1726882930.49596: stdout chunk (state=3): >>><<< 30564 1726882930.49617: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882930.4619935-36204-170266225084445=/root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882930.49647: variable 'ansible_module_compression' from source: unknown 30564 1726882930.49704: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882930.49742: variable 'ansible_facts' from source: unknown 30564 1726882930.49827: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445/AnsiballZ_command.py 30564 1726882930.49986: Sending initial data 30564 1726882930.49990: Sent initial data (156 bytes) 30564 1726882930.51029: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882930.51038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.51048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.51074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.51114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.51135: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882930.51138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.51144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882930.51171: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882930.51174: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882930.51181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.51221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.51226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.51228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.51250: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882930.51253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.51342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.51350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.51367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.51594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.53323: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882930.53437: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882930.53546: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmp2tjgrb36 /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445/AnsiballZ_command.py <<< 30564 1726882930.53632: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882930.55027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.55135: stderr chunk (state=3): >>><<< 30564 1726882930.55154: stdout chunk (state=3): >>><<< 30564 1726882930.55166: done transferring module to remote 30564 1726882930.55188: _low_level_execute_command(): starting 30564 1726882930.55191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445/ /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445/AnsiballZ_command.py && sleep 0' 30564 1726882930.56058: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882930.56100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.56111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.56123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.56178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.56185: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882930.56195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.56232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882930.56235: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882930.56247: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 
1726882930.56255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.56267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.56283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.56291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882930.56297: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882930.56322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.56400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.56416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.56452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.56611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.58385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.58446: stderr chunk (state=3): >>><<< 30564 1726882930.58452: stdout chunk (state=3): >>><<< 30564 1726882930.58486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882930.58490: _low_level_execute_command(): starting 30564 1726882930.58492: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445/AnsiballZ_command.py && sleep 0' 30564 1726882930.58903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.58906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.58943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882930.58947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.58951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.58997: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.59000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.59108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.74100: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:42:10.720341", "end": "2024-09-20 21:42:10.738978", "delta": "0:00:00.018637", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882930.75350: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882930.75355: stdout chunk (state=3): >>><<< 30564 1726882930.75357: stderr chunk (state=3): >>><<< 30564 1726882930.75389: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "start": "2024-09-20 21:42:10.720341", "end": "2024-09-20 21:42:10.738978", "delta": "0:00:00.018637", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
30564 1726882930.75431: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882930.75435: _low_level_execute_command(): starting 30564 1726882930.75437: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882930.4619935-36204-170266225084445/ > /dev/null 2>&1 && sleep 0' 30564 1726882930.75914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.75922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882930.75956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.75960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882930.75984: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882930.75987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882930.76038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882930.76042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882930.76044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882930.76168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882930.77980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882930.78032: stderr chunk (state=3): >>><<< 30564 1726882930.78035: stdout chunk (state=3): >>><<< 30564 1726882930.78049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882930.78056: handler run complete 30564 1726882930.78077: Evaluated conditional (False): False 30564 1726882930.78089: attempt loop complete, returning result 30564 1726882930.78092: _execute() done 30564 1726882930.78095: dumping result to json 30564 1726882930.78097: done dumping result, returning 30564 1726882930.78106: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-4216-acec-00000000288b] 30564 1726882930.78111: sending task result for task 0e448fcc-3ce9-4216-acec-00000000288b 30564 1726882930.78213: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000288b 30564 1726882930.78216: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep statebr | grep /etc", "delta": "0:00:00.018637", "end": "2024-09-20 21:42:10.738978", "rc": 1, "start": "2024-09-20 21:42:10.720341" } MSG: non-zero return code ...ignoring 30564 1726882930.78293: no more pending results, returning what we have 30564 1726882930.78297: results queue empty 30564 1726882930.78298: checking for any_errors_fatal 30564 1726882930.78306: done checking for any_errors_fatal 30564 1726882930.78306: checking for max_fail_percentage 30564 1726882930.78308: done checking for max_fail_percentage 30564 1726882930.78309: checking to see if all hosts have failed and the running result is not ok 30564 1726882930.78310: done checking to see if all hosts have failed 30564 1726882930.78311: getting the remaining hosts for this loop 30564 1726882930.78313: done getting the remaining hosts for this loop 30564 1726882930.78316: getting the next task for host managed_node2 30564 1726882930.78327: done getting next task for host managed_node2 30564 1726882930.78329: ^ task is: TASK: Set NM profile 
exist flag and ansible_managed flag true based on the nmcli output 30564 1726882930.78334: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882930.78338: getting variables 30564 1726882930.78339: in VariableManager get_vars() 30564 1726882930.78393: Calling all_inventory to load vars for managed_node2 30564 1726882930.78395: Calling groups_inventory to load vars for managed_node2 30564 1726882930.78398: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882930.78410: Calling all_plugins_play to load vars for managed_node2 30564 1726882930.78412: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882930.78415: Calling groups_plugins_play to load vars for managed_node2 30564 1726882930.79282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882930.80225: done with get_vars() 30564 1726882930.80242: done getting variables 30564 1726882930.80290: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:42:10 -0400 (0:00:00.414) 0:02:09.384 ****** 30564 1726882930.80322: entering _queue_task() for managed_node2/set_fact 30564 1726882930.80543: worker is 1 (out of 1 available) 30564 1726882930.80555: exiting _queue_task() for managed_node2/set_fact 30564 1726882930.80569: done queuing things up, now waiting for results queue to drain 30564 1726882930.80571: waiting for pending results... 
30564 1726882930.80765: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 30564 1726882930.80850: in run() - task 0e448fcc-3ce9-4216-acec-00000000288c 30564 1726882930.80863: variable 'ansible_search_path' from source: unknown 30564 1726882930.80874: variable 'ansible_search_path' from source: unknown 30564 1726882930.80903: calling self._execute() 30564 1726882930.80981: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.80985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.80997: variable 'omit' from source: magic vars 30564 1726882930.81277: variable 'ansible_distribution_major_version' from source: facts 30564 1726882930.81288: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882930.81381: variable 'nm_profile_exists' from source: set_fact 30564 1726882930.81391: Evaluated conditional (nm_profile_exists.rc == 0): False 30564 1726882930.81394: when evaluation is False, skipping this task 30564 1726882930.81397: _execute() done 30564 1726882930.81399: dumping result to json 30564 1726882930.81403: done dumping result, returning 30564 1726882930.81409: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-4216-acec-00000000288c] 30564 1726882930.81414: sending task result for task 0e448fcc-3ce9-4216-acec-00000000288c 30564 1726882930.81506: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000288c 30564 1726882930.81509: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 30564 1726882930.81566: no more pending results, returning what we have 30564 1726882930.81572: results queue empty 30564 1726882930.81573: checking for any_errors_fatal 30564 
1726882930.81580: done checking for any_errors_fatal 30564 1726882930.81580: checking for max_fail_percentage 30564 1726882930.81582: done checking for max_fail_percentage 30564 1726882930.81582: checking to see if all hosts have failed and the running result is not ok 30564 1726882930.81583: done checking to see if all hosts have failed 30564 1726882930.81584: getting the remaining hosts for this loop 30564 1726882930.81585: done getting the remaining hosts for this loop 30564 1726882930.81589: getting the next task for host managed_node2 30564 1726882930.81598: done getting next task for host managed_node2 30564 1726882930.81600: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882930.81604: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882930.81608: getting variables 30564 1726882930.81609: in VariableManager get_vars() 30564 1726882930.81653: Calling all_inventory to load vars for managed_node2 30564 1726882930.81656: Calling groups_inventory to load vars for managed_node2 30564 1726882930.81660: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882930.81672: Calling all_plugins_play to load vars for managed_node2 30564 1726882930.81676: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882930.81678: Calling groups_plugins_play to load vars for managed_node2 30564 1726882930.86661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882930.87608: done with get_vars() 30564 1726882930.87624: done getting variables 30564 1726882930.87657: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882930.87732: variable 'profile' from source: play vars 30564 1726882930.87734: variable 'interface' from source: play vars 30564 1726882930.87777: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-statebr] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:42:10 -0400 (0:00:00.074) 0:02:09.459 ****** 30564 1726882930.87799: entering _queue_task() for managed_node2/command 30564 1726882930.88047: worker is 1 (out of 1 available) 30564 1726882930.88061: exiting _queue_task() for managed_node2/command 30564 1726882930.88077: done queuing things up, now waiting for results queue to drain 30564 1726882930.88079: waiting for pending results... 
30564 1726882930.88273: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr 30564 1726882930.88352: in run() - task 0e448fcc-3ce9-4216-acec-00000000288e 30564 1726882930.88364: variable 'ansible_search_path' from source: unknown 30564 1726882930.88372: variable 'ansible_search_path' from source: unknown 30564 1726882930.88398: calling self._execute() 30564 1726882930.88479: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.88484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.88492: variable 'omit' from source: magic vars 30564 1726882930.88781: variable 'ansible_distribution_major_version' from source: facts 30564 1726882930.88794: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882930.88884: variable 'profile_stat' from source: set_fact 30564 1726882930.88897: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882930.88902: when evaluation is False, skipping this task 30564 1726882930.88905: _execute() done 30564 1726882930.88908: dumping result to json 30564 1726882930.88910: done dumping result, returning 30564 1726882930.88913: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000288e] 30564 1726882930.88915: sending task result for task 0e448fcc-3ce9-4216-acec-00000000288e 30564 1726882930.89012: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000288e 30564 1726882930.89014: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882930.89073: no more pending results, returning what we have 30564 1726882930.89077: results queue empty 30564 1726882930.89078: checking for any_errors_fatal 30564 1726882930.89090: done checking for any_errors_fatal 30564 1726882930.89091: 
checking for max_fail_percentage 30564 1726882930.89093: done checking for max_fail_percentage 30564 1726882930.89094: checking to see if all hosts have failed and the running result is not ok 30564 1726882930.89094: done checking to see if all hosts have failed 30564 1726882930.89095: getting the remaining hosts for this loop 30564 1726882930.89097: done getting the remaining hosts for this loop 30564 1726882930.89100: getting the next task for host managed_node2 30564 1726882930.89108: done getting next task for host managed_node2 30564 1726882930.89110: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 30564 1726882930.89116: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882930.89119: getting variables 30564 1726882930.89120: in VariableManager get_vars() 30564 1726882930.89171: Calling all_inventory to load vars for managed_node2 30564 1726882930.89175: Calling groups_inventory to load vars for managed_node2 30564 1726882930.89178: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882930.89187: Calling all_plugins_play to load vars for managed_node2 30564 1726882930.89190: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882930.89193: Calling groups_plugins_play to load vars for managed_node2 30564 1726882930.90146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882930.91154: done with get_vars() 30564 1726882930.91179: done getting variables 30564 1726882930.91234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882930.91344: variable 'profile' from source: play vars 30564 1726882930.91349: variable 'interface' from source: play vars 30564 1726882930.91410: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-statebr] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:42:10 -0400 (0:00:00.036) 0:02:09.495 ****** 30564 1726882930.91441: entering _queue_task() for managed_node2/set_fact 30564 1726882930.91741: worker is 1 (out of 1 available) 30564 1726882930.91752: exiting _queue_task() for managed_node2/set_fact 30564 1726882930.91767: done queuing things up, now waiting for results queue to drain 30564 1726882930.91770: waiting for pending results... 
30564 1726882930.92075: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr 30564 1726882930.92222: in run() - task 0e448fcc-3ce9-4216-acec-00000000288f 30564 1726882930.92242: variable 'ansible_search_path' from source: unknown 30564 1726882930.92250: variable 'ansible_search_path' from source: unknown 30564 1726882930.92294: calling self._execute() 30564 1726882930.92406: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.92418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.92437: variable 'omit' from source: magic vars 30564 1726882930.92816: variable 'ansible_distribution_major_version' from source: facts 30564 1726882930.92825: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882930.92919: variable 'profile_stat' from source: set_fact 30564 1726882930.92928: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882930.92931: when evaluation is False, skipping this task 30564 1726882930.92933: _execute() done 30564 1726882930.92936: dumping result to json 30564 1726882930.92940: done dumping result, returning 30564 1726882930.92946: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-00000000288f] 30564 1726882930.92951: sending task result for task 0e448fcc-3ce9-4216-acec-00000000288f 30564 1726882930.93048: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000288f 30564 1726882930.93050: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882930.93098: no more pending results, returning what we have 30564 1726882930.93101: results queue empty 30564 1726882930.93102: checking for any_errors_fatal 30564 1726882930.93110: done checking for any_errors_fatal 30564 1726882930.93110: 
checking for max_fail_percentage 30564 1726882930.93112: done checking for max_fail_percentage 30564 1726882930.93113: checking to see if all hosts have failed and the running result is not ok 30564 1726882930.93113: done checking to see if all hosts have failed 30564 1726882930.93114: getting the remaining hosts for this loop 30564 1726882930.93116: done getting the remaining hosts for this loop 30564 1726882930.93120: getting the next task for host managed_node2 30564 1726882930.93129: done getting next task for host managed_node2 30564 1726882930.93131: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 30564 1726882930.93136: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882930.93139: getting variables 30564 1726882930.93140: in VariableManager get_vars() 30564 1726882930.93180: Calling all_inventory to load vars for managed_node2 30564 1726882930.93182: Calling groups_inventory to load vars for managed_node2 30564 1726882930.93185: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882930.93195: Calling all_plugins_play to load vars for managed_node2 30564 1726882930.93198: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882930.93220: Calling groups_plugins_play to load vars for managed_node2 30564 1726882930.94048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882930.95020: done with get_vars() 30564 1726882930.95035: done getting variables 30564 1726882930.95084: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882930.95160: variable 'profile' from source: play vars 30564 1726882930.95163: variable 'interface' from source: play vars 30564 1726882930.95207: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-statebr] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:42:10 -0400 (0:00:00.037) 0:02:09.533 ****** 30564 1726882930.95229: entering _queue_task() for managed_node2/command 30564 1726882930.95452: worker is 1 (out of 1 available) 30564 1726882930.95470: exiting _queue_task() for managed_node2/command 30564 1726882930.95484: done queuing things up, now waiting for results queue to drain 30564 1726882930.95486: waiting for pending results... 
30564 1726882930.95696: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr 30564 1726882930.95793: in run() - task 0e448fcc-3ce9-4216-acec-000000002890 30564 1726882930.95805: variable 'ansible_search_path' from source: unknown 30564 1726882930.95808: variable 'ansible_search_path' from source: unknown 30564 1726882930.95838: calling self._execute() 30564 1726882930.95931: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882930.95936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882930.95939: variable 'omit' from source: magic vars 30564 1726882930.96369: variable 'ansible_distribution_major_version' from source: facts 30564 1726882930.96373: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882930.96515: variable 'profile_stat' from source: set_fact 30564 1726882930.96518: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882930.96520: when evaluation is False, skipping this task 30564 1726882930.96522: _execute() done 30564 1726882930.96524: dumping result to json 30564 1726882930.96526: done dumping result, returning 30564 1726882930.96527: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000002890] 30564 1726882930.96529: sending task result for task 0e448fcc-3ce9-4216-acec-000000002890 30564 1726882930.96596: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002890 30564 1726882930.96599: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882930.96652: no more pending results, returning what we have 30564 1726882930.96655: results queue empty 30564 1726882930.96656: checking for any_errors_fatal 30564 1726882930.96661: done checking for any_errors_fatal 30564 1726882930.96662: checking for 
max_fail_percentage 30564 1726882930.96666: done checking for max_fail_percentage 30564 1726882930.96667: checking to see if all hosts have failed and the running result is not ok 30564 1726882930.96668: done checking to see if all hosts have failed 30564 1726882930.96669: getting the remaining hosts for this loop 30564 1726882930.96670: done getting the remaining hosts for this loop 30564 1726882930.96674: getting the next task for host managed_node2 30564 1726882930.96681: done getting next task for host managed_node2 30564 1726882930.96684: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 30564 1726882930.96689: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882930.96692: getting variables 30564 1726882930.96693: in VariableManager get_vars() 30564 1726882930.96751: Calling all_inventory to load vars for managed_node2 30564 1726882930.96754: Calling groups_inventory to load vars for managed_node2 30564 1726882930.96757: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882930.96768: Calling all_plugins_play to load vars for managed_node2 30564 1726882930.96771: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882930.97403: Calling groups_plugins_play to load vars for managed_node2 30564 1726882930.98774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882931.00960: done with get_vars() 30564 1726882931.00987: done getting variables 30564 1726882931.01046: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882931.01162: variable 'profile' from source: play vars 30564 1726882931.01170: variable 'interface' from source: play vars 30564 1726882931.01231: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-statebr] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:42:11 -0400 (0:00:00.060) 0:02:09.593 ****** 30564 1726882931.01262: entering _queue_task() for managed_node2/set_fact 30564 1726882931.01562: worker is 1 (out of 1 available) 30564 1726882931.01888: exiting _queue_task() for managed_node2/set_fact 30564 1726882931.01901: done queuing things up, now waiting for results queue to drain 30564 1726882931.01903: waiting for pending results... 
30564 1726882931.02267: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr 30564 1726882931.02440: in run() - task 0e448fcc-3ce9-4216-acec-000000002891 30564 1726882931.02467: variable 'ansible_search_path' from source: unknown 30564 1726882931.02475: variable 'ansible_search_path' from source: unknown 30564 1726882931.02523: calling self._execute() 30564 1726882931.02646: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.02657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.02678: variable 'omit' from source: magic vars 30564 1726882931.03115: variable 'ansible_distribution_major_version' from source: facts 30564 1726882931.03135: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882931.03294: variable 'profile_stat' from source: set_fact 30564 1726882931.03312: Evaluated conditional (profile_stat.stat.exists): False 30564 1726882931.03323: when evaluation is False, skipping this task 30564 1726882931.03329: _execute() done 30564 1726882931.03335: dumping result to json 30564 1726882931.03342: done dumping result, returning 30564 1726882931.03350: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-statebr [0e448fcc-3ce9-4216-acec-000000002891] 30564 1726882931.03366: sending task result for task 0e448fcc-3ce9-4216-acec-000000002891 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 30564 1726882931.03522: no more pending results, returning what we have 30564 1726882931.03526: results queue empty 30564 1726882931.03527: checking for any_errors_fatal 30564 1726882931.03537: done checking for any_errors_fatal 30564 1726882931.03537: checking for max_fail_percentage 30564 1726882931.03539: done checking for max_fail_percentage 30564 1726882931.03540: checking to see if all hosts 
have failed and the running result is not ok 30564 1726882931.03541: done checking to see if all hosts have failed 30564 1726882931.03541: getting the remaining hosts for this loop 30564 1726882931.03543: done getting the remaining hosts for this loop 30564 1726882931.03547: getting the next task for host managed_node2 30564 1726882931.03558: done getting next task for host managed_node2 30564 1726882931.03560: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 30564 1726882931.03567: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882931.03571: getting variables 30564 1726882931.03573: in VariableManager get_vars() 30564 1726882931.03622: Calling all_inventory to load vars for managed_node2 30564 1726882931.03625: Calling groups_inventory to load vars for managed_node2 30564 1726882931.03629: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882931.03643: Calling all_plugins_play to load vars for managed_node2 30564 1726882931.03646: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882931.03650: Calling groups_plugins_play to load vars for managed_node2 30564 1726882931.05460: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002891 30564 1726882931.05463: WORKER PROCESS EXITING 30564 1726882931.08755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882931.11605: done with get_vars() 30564 1726882931.11631: done getting variables 30564 1726882931.11709: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882931.11846: variable 'profile' from source: play vars 30564 1726882931.11851: variable 'interface' from source: play vars 30564 1726882931.11925: variable 'interface' from source: play vars TASK [Assert that the profile is absent - 'statebr'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:42:11 -0400 (0:00:00.106) 0:02:09.700 ****** 30564 1726882931.11958: entering _queue_task() for managed_node2/assert 30564 1726882931.13099: worker is 1 (out of 1 available) 30564 1726882931.13111: exiting _queue_task() for managed_node2/assert 30564 
1726882931.13239: done queuing things up, now waiting for results queue to drain 30564 1726882931.13241: waiting for pending results... 30564 1726882931.13838: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' 30564 1726882931.13958: in run() - task 0e448fcc-3ce9-4216-acec-000000002805 30564 1726882931.13975: variable 'ansible_search_path' from source: unknown 30564 1726882931.13979: variable 'ansible_search_path' from source: unknown 30564 1726882931.14013: calling self._execute() 30564 1726882931.14111: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.14118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.14128: variable 'omit' from source: magic vars 30564 1726882931.15501: variable 'ansible_distribution_major_version' from source: facts 30564 1726882931.15513: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882931.15521: variable 'omit' from source: magic vars 30564 1726882931.15570: variable 'omit' from source: magic vars 30564 1726882931.15676: variable 'profile' from source: play vars 30564 1726882931.15679: variable 'interface' from source: play vars 30564 1726882931.15744: variable 'interface' from source: play vars 30564 1726882931.15762: variable 'omit' from source: magic vars 30564 1726882931.15914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882931.15951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882931.15971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882931.15991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.16004: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.16037: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882931.16040: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.16043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.16351: Set connection var ansible_timeout to 10 30564 1726882931.16355: Set connection var ansible_pipelining to False 30564 1726882931.16358: Set connection var ansible_shell_type to sh 30564 1726882931.16366: Set connection var ansible_shell_executable to /bin/sh 30564 1726882931.16377: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882931.16380: Set connection var ansible_connection to ssh 30564 1726882931.16406: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.16409: variable 'ansible_connection' from source: unknown 30564 1726882931.16414: variable 'ansible_module_compression' from source: unknown 30564 1726882931.16417: variable 'ansible_shell_type' from source: unknown 30564 1726882931.16419: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.16421: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.16424: variable 'ansible_pipelining' from source: unknown 30564 1726882931.16426: variable 'ansible_timeout' from source: unknown 30564 1726882931.16428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.16767: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882931.16782: variable 'omit' from source: magic vars 30564 1726882931.16787: starting 
attempt loop 30564 1726882931.16790: running the handler 30564 1726882931.17119: variable 'lsr_net_profile_exists' from source: set_fact 30564 1726882931.17122: Evaluated conditional (not lsr_net_profile_exists): True 30564 1726882931.17129: handler run complete 30564 1726882931.17142: attempt loop complete, returning result 30564 1726882931.17145: _execute() done 30564 1726882931.17147: dumping result to json 30564 1726882931.17150: done dumping result, returning 30564 1726882931.17157: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'statebr' [0e448fcc-3ce9-4216-acec-000000002805] 30564 1726882931.17163: sending task result for task 0e448fcc-3ce9-4216-acec-000000002805 30564 1726882931.17256: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002805 30564 1726882931.17258: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882931.17313: no more pending results, returning what we have 30564 1726882931.17318: results queue empty 30564 1726882931.17319: checking for any_errors_fatal 30564 1726882931.17327: done checking for any_errors_fatal 30564 1726882931.17328: checking for max_fail_percentage 30564 1726882931.17330: done checking for max_fail_percentage 30564 1726882931.17331: checking to see if all hosts have failed and the running result is not ok 30564 1726882931.17332: done checking to see if all hosts have failed 30564 1726882931.17333: getting the remaining hosts for this loop 30564 1726882931.17334: done getting the remaining hosts for this loop 30564 1726882931.17338: getting the next task for host managed_node2 30564 1726882931.17349: done getting next task for host managed_node2 30564 1726882931.17352: ^ task is: TASK: Get NetworkManager RPM version 30564 1726882931.17357: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882931.17361: getting variables 30564 1726882931.17363: in VariableManager get_vars() 30564 1726882931.17414: Calling all_inventory to load vars for managed_node2 30564 1726882931.17417: Calling groups_inventory to load vars for managed_node2 30564 1726882931.17420: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882931.17432: Calling all_plugins_play to load vars for managed_node2 30564 1726882931.17435: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882931.17438: Calling groups_plugins_play to load vars for managed_node2 30564 1726882931.19692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882931.21571: done with get_vars() 30564 1726882931.21602: done getting variables 30564 1726882931.21666: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NetworkManager RPM version] ****************************************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:7 Friday 20 September 2024 21:42:11 -0400 (0:00:00.097) 0:02:09.798 ****** 30564 1726882931.21705: entering _queue_task() for managed_node2/command 30564 1726882931.22034: worker is 1 (out of 1 available) 30564 1726882931.22052: exiting _queue_task() for managed_node2/command 30564 1726882931.22070: done queuing things up, now waiting for results queue to drain 30564 1726882931.22072: waiting for pending results... 30564 1726882931.22381: running TaskExecutor() for managed_node2/TASK: Get NetworkManager RPM version 30564 1726882931.22518: in run() - task 0e448fcc-3ce9-4216-acec-000000002809 30564 1726882931.22536: variable 'ansible_search_path' from source: unknown 30564 1726882931.22544: variable 'ansible_search_path' from source: unknown 30564 1726882931.22589: calling self._execute() 30564 1726882931.22727: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.22741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.22755: variable 'omit' from source: magic vars 30564 1726882931.23204: variable 'ansible_distribution_major_version' from source: facts 30564 1726882931.23222: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882931.23235: variable 'omit' from source: magic vars 30564 1726882931.23297: variable 'omit' from source: magic vars 30564 1726882931.23334: variable 'omit' from source: magic vars 30564 1726882931.23394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882931.23433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882931.23465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882931.23499: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.23516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.23553: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882931.23564: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.23579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.23717: Set connection var ansible_timeout to 10 30564 1726882931.23729: Set connection var ansible_pipelining to False 30564 1726882931.23736: Set connection var ansible_shell_type to sh 30564 1726882931.23747: Set connection var ansible_shell_executable to /bin/sh 30564 1726882931.23760: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882931.23769: Set connection var ansible_connection to ssh 30564 1726882931.23804: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.23826: variable 'ansible_connection' from source: unknown 30564 1726882931.23834: variable 'ansible_module_compression' from source: unknown 30564 1726882931.23840: variable 'ansible_shell_type' from source: unknown 30564 1726882931.23845: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.23851: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.23858: variable 'ansible_pipelining' from source: unknown 30564 1726882931.23867: variable 'ansible_timeout' from source: unknown 30564 1726882931.23875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.24026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882931.24053: variable 'omit' from source: magic vars 30564 1726882931.24062: starting attempt loop 30564 1726882931.24071: running the handler 30564 1726882931.24091: _low_level_execute_command(): starting 30564 1726882931.24103: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882931.24705: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.24710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.24736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882931.24740: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.24786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882931.24789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882931.24800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882931.24915: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882931.26575: stdout chunk (state=3): >>>/root <<< 30564 1726882931.26692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882931.26757: stderr chunk (state=3): >>><<< 30564 1726882931.26763: stdout chunk (state=3): >>><<< 30564 1726882931.26771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882931.26788: _low_level_execute_command(): starting 30564 1726882931.26791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970 `" && echo ansible-tmp-1726882931.267599-36236-198090247819970="` echo /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970 `" ) && sleep 0' 30564 
1726882931.27372: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882931.27379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.27390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.27410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.27447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882931.27458: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882931.27461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.27480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882931.27488: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882931.27497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882931.27504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.27519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.27530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.27537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882931.27544: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882931.27553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.27632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882931.27644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882931.27656: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882931.27784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882931.29657: stdout chunk (state=3): >>>ansible-tmp-1726882931.267599-36236-198090247819970=/root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970 <<< 30564 1726882931.29827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882931.29830: stdout chunk (state=3): >>><<< 30564 1726882931.29838: stderr chunk (state=3): >>><<< 30564 1726882931.29875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882931.267599-36236-198090247819970=/root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882931.29909: variable 'ansible_module_compression' from source: unknown 
30564 1726882931.29971: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882931.30006: variable 'ansible_facts' from source: unknown 30564 1726882931.30106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970/AnsiballZ_command.py 30564 1726882931.30250: Sending initial data 30564 1726882931.30253: Sent initial data (155 bytes) 30564 1726882931.31090: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882931.31098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.31107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.31118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.31144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882931.31154: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882931.31179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.31198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.31201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.31250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 30564 1726882931.31255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882931.31363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882931.33078: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882931.33170: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882931.33262: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpxs9tzlz6 /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970/AnsiballZ_command.py <<< 30564 1726882931.33384: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882931.34588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882931.34746: stderr chunk (state=3): >>><<< 30564 1726882931.34756: stdout chunk (state=3): >>><<< 30564 1726882931.34876: done transferring module to remote 30564 1726882931.34879: _low_level_execute_command(): starting 30564 1726882931.34881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970/ /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970/AnsiballZ_command.py && sleep 0' 30564 1726882931.35498: stderr 
chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882931.35502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.35504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.35507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.35518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882931.35525: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.35537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882931.35543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.35548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.35569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 30564 1726882931.35571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.35620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882931.35625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882931.35751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882931.37506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882931.37544: stderr chunk (state=3): >>><<< 30564 
1726882931.37548: stdout chunk (state=3): >>><<< 30564 1726882931.37561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882931.37566: _low_level_execute_command(): starting 30564 1726882931.37572: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970/AnsiballZ_command.py && sleep 0' 30564 1726882931.37994: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.37997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.38033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.38037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.38040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.38088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882931.38091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882931.38197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882931.73163: stdout chunk (state=3): >>> {"changed": true, "stdout": "NetworkManager-1.51.0-1.el9", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 21:42:11.510966", "end": "2024-09-20 21:42:11.729505", "delta": "0:00:00.218539", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882931.74558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882931.74562: stdout chunk (state=3): >>><<< 30564 1726882931.74584: stderr chunk (state=3): >>><<< 30564 1726882931.74603: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "NetworkManager-1.51.0-1.el9", "stderr": "", "rc": 0, "cmd": ["rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager"], "start": "2024-09-20 21:42:11.510966", "end": "2024-09-20 21:42:11.729505", "delta": "0:00:00.218539", "msg": "", "invocation": {"module_args": {"_raw_params": "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882931.74668: done with _execute_module (ansible.legacy.command, {'_raw_params': "rpm -qa --qf '%{name}-%{version}-%{release}\\n' NetworkManager", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882931.74703: _low_level_execute_command(): starting 30564 1726882931.74706: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882931.267599-36236-198090247819970/ > /dev/null 2>&1 && sleep 0' 30564 1726882931.75439: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882931.75449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.75476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.75491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.75528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882931.75535: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882931.75545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.75559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 
1726882931.75572: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882931.75593: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882931.75601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882931.75610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882931.75622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882931.75630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882931.75637: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882931.75646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882931.75731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882931.75750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882931.75762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882931.75891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882931.77693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882931.77744: stderr chunk (state=3): >>><<< 30564 1726882931.77751: stdout chunk (state=3): >>><<< 30564 1726882931.77765: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882931.77773: handler run complete 30564 1726882931.77790: Evaluated conditional (False): False 30564 1726882931.77799: attempt loop complete, returning result 30564 1726882931.77802: _execute() done 30564 1726882931.77804: dumping result to json 30564 1726882931.77809: done dumping result, returning 30564 1726882931.77816: done running TaskExecutor() for managed_node2/TASK: Get NetworkManager RPM version [0e448fcc-3ce9-4216-acec-000000002809] 30564 1726882931.77821: sending task result for task 0e448fcc-3ce9-4216-acec-000000002809 30564 1726882931.77926: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002809 30564 1726882931.77929: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "rpm", "-qa", "--qf", "%{name}-%{version}-%{release}\\n", "NetworkManager" ], "delta": "0:00:00.218539", "end": "2024-09-20 21:42:11.729505", "rc": 0, "start": "2024-09-20 21:42:11.510966" } STDOUT: NetworkManager-1.51.0-1.el9 30564 1726882931.78008: no more pending results, returning what we have 30564 1726882931.78011: results queue empty 30564 1726882931.78013: checking for any_errors_fatal 30564 1726882931.78022: done checking for any_errors_fatal 30564 1726882931.78022: checking for max_fail_percentage 30564 
1726882931.78024: done checking for max_fail_percentage 30564 1726882931.78025: checking to see if all hosts have failed and the running result is not ok 30564 1726882931.78026: done checking to see if all hosts have failed 30564 1726882931.78027: getting the remaining hosts for this loop 30564 1726882931.78031: done getting the remaining hosts for this loop 30564 1726882931.78038: getting the next task for host managed_node2 30564 1726882931.78052: done getting next task for host managed_node2 30564 1726882931.78055: ^ task is: TASK: Store NetworkManager version 30564 1726882931.78063: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882931.78077: getting variables 30564 1726882931.78079: in VariableManager get_vars() 30564 1726882931.78151: Calling all_inventory to load vars for managed_node2 30564 1726882931.78158: Calling groups_inventory to load vars for managed_node2 30564 1726882931.78162: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882931.78488: Calling all_plugins_play to load vars for managed_node2 30564 1726882931.78561: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882931.78574: Calling groups_plugins_play to load vars for managed_node2 30564 1726882931.80836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882931.83491: done with get_vars() 30564 1726882931.83510: done getting variables 30564 1726882931.83553: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Store NetworkManager version] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:14 Friday 20 September 2024 21:42:11 -0400 (0:00:00.618) 0:02:10.417 ****** 30564 1726882931.83579: entering _queue_task() for managed_node2/set_fact 30564 1726882931.83813: worker is 1 (out of 1 available) 30564 1726882931.83826: exiting _queue_task() for managed_node2/set_fact 30564 1726882931.83838: done queuing things up, now waiting for results queue to drain 30564 1726882931.83839: waiting for pending results... 
30564 1726882931.84038: running TaskExecutor() for managed_node2/TASK: Store NetworkManager version 30564 1726882931.84139: in run() - task 0e448fcc-3ce9-4216-acec-00000000280a 30564 1726882931.84149: variable 'ansible_search_path' from source: unknown 30564 1726882931.84154: variable 'ansible_search_path' from source: unknown 30564 1726882931.84191: calling self._execute() 30564 1726882931.84378: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.84382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.84386: variable 'omit' from source: magic vars 30564 1726882931.84714: variable 'ansible_distribution_major_version' from source: facts 30564 1726882931.84725: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882931.84732: variable 'omit' from source: magic vars 30564 1726882931.84783: variable 'omit' from source: magic vars 30564 1726882931.84894: variable '__rpm_q_networkmanager' from source: set_fact 30564 1726882931.84920: variable 'omit' from source: magic vars 30564 1726882931.84961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882931.84998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882931.85024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882931.85044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.85052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.85084: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882931.85088: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.85091: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.85195: Set connection var ansible_timeout to 10 30564 1726882931.85198: Set connection var ansible_pipelining to False 30564 1726882931.85201: Set connection var ansible_shell_type to sh 30564 1726882931.85208: Set connection var ansible_shell_executable to /bin/sh 30564 1726882931.85216: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882931.85218: Set connection var ansible_connection to ssh 30564 1726882931.85340: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.85344: variable 'ansible_connection' from source: unknown 30564 1726882931.85347: variable 'ansible_module_compression' from source: unknown 30564 1726882931.85349: variable 'ansible_shell_type' from source: unknown 30564 1726882931.85351: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.85353: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.85355: variable 'ansible_pipelining' from source: unknown 30564 1726882931.85357: variable 'ansible_timeout' from source: unknown 30564 1726882931.85359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.85656: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882931.85686: variable 'omit' from source: magic vars 30564 1726882931.85700: starting attempt loop 30564 1726882931.85717: running the handler 30564 1726882931.85748: handler run complete 30564 1726882931.85775: attempt loop complete, returning result 30564 1726882931.85785: _execute() done 30564 1726882931.85796: dumping result to json 30564 1726882931.85807: done dumping result, returning 30564 
1726882931.86273: done running TaskExecutor() for managed_node2/TASK: Store NetworkManager version [0e448fcc-3ce9-4216-acec-00000000280a] 30564 1726882931.86276: sending task result for task 0e448fcc-3ce9-4216-acec-00000000280a 30564 1726882931.86335: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000280a 30564 1726882931.86337: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "networkmanager_nvr": "NetworkManager-1.51.0-1.el9" }, "changed": false } 30564 1726882931.86385: no more pending results, returning what we have 30564 1726882931.86388: results queue empty 30564 1726882931.86389: checking for any_errors_fatal 30564 1726882931.86397: done checking for any_errors_fatal 30564 1726882931.86398: checking for max_fail_percentage 30564 1726882931.86399: done checking for max_fail_percentage 30564 1726882931.86400: checking to see if all hosts have failed and the running result is not ok 30564 1726882931.86401: done checking to see if all hosts have failed 30564 1726882931.86402: getting the remaining hosts for this loop 30564 1726882931.86403: done getting the remaining hosts for this loop 30564 1726882931.86406: getting the next task for host managed_node2 30564 1726882931.86413: done getting next task for host managed_node2 30564 1726882931.86415: ^ task is: TASK: Show NetworkManager version 30564 1726882931.86418: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882931.86421: getting variables 30564 1726882931.86422: in VariableManager get_vars() 30564 1726882931.86457: Calling all_inventory to load vars for managed_node2 30564 1726882931.86459: Calling groups_inventory to load vars for managed_node2 30564 1726882931.86463: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882931.86475: Calling all_plugins_play to load vars for managed_node2 30564 1726882931.86478: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882931.86481: Calling groups_plugins_play to load vars for managed_node2 30564 1726882931.88008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882931.89981: done with get_vars() 30564 1726882931.90004: done getting variables 30564 1726882931.90072: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show NetworkManager version] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_NetworkManager_NVR.yml:18 Friday 20 September 2024 21:42:11 -0400 (0:00:00.065) 0:02:10.482 ****** 30564 1726882931.90105: entering _queue_task() for managed_node2/debug 30564 1726882931.90426: worker is 1 (out of 1 available) 30564 1726882931.90442: exiting _queue_task() for managed_node2/debug 30564 1726882931.90460: done queuing things up, now waiting for results queue to drain 30564 1726882931.90463: waiting for pending 
results... 30564 1726882931.90705: running TaskExecutor() for managed_node2/TASK: Show NetworkManager version 30564 1726882931.90799: in run() - task 0e448fcc-3ce9-4216-acec-00000000280b 30564 1726882931.90815: variable 'ansible_search_path' from source: unknown 30564 1726882931.90819: variable 'ansible_search_path' from source: unknown 30564 1726882931.90846: calling self._execute() 30564 1726882931.90930: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.90934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.90947: variable 'omit' from source: magic vars 30564 1726882931.91243: variable 'ansible_distribution_major_version' from source: facts 30564 1726882931.91254: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882931.91261: variable 'omit' from source: magic vars 30564 1726882931.91299: variable 'omit' from source: magic vars 30564 1726882931.91320: variable 'omit' from source: magic vars 30564 1726882931.91354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882931.91386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882931.91402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882931.91414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.91424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882931.91449: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882931.91454: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.91456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 
1726882931.91530: Set connection var ansible_timeout to 10 30564 1726882931.91533: Set connection var ansible_pipelining to False 30564 1726882931.91535: Set connection var ansible_shell_type to sh 30564 1726882931.91541: Set connection var ansible_shell_executable to /bin/sh 30564 1726882931.91548: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882931.91551: Set connection var ansible_connection to ssh 30564 1726882931.91570: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.91576: variable 'ansible_connection' from source: unknown 30564 1726882931.91579: variable 'ansible_module_compression' from source: unknown 30564 1726882931.91581: variable 'ansible_shell_type' from source: unknown 30564 1726882931.91584: variable 'ansible_shell_executable' from source: unknown 30564 1726882931.91586: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.91588: variable 'ansible_pipelining' from source: unknown 30564 1726882931.91592: variable 'ansible_timeout' from source: unknown 30564 1726882931.91595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.91696: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882931.91709: variable 'omit' from source: magic vars 30564 1726882931.91712: starting attempt loop 30564 1726882931.91715: running the handler 30564 1726882931.91749: variable 'networkmanager_nvr' from source: set_fact 30564 1726882931.91806: variable 'networkmanager_nvr' from source: set_fact 30564 1726882931.91817: handler run complete 30564 1726882931.91829: attempt loop complete, returning result 30564 1726882931.91832: _execute() done 30564 1726882931.91835: dumping result to json 30564 
1726882931.91837: done dumping result, returning 30564 1726882931.91843: done running TaskExecutor() for managed_node2/TASK: Show NetworkManager version [0e448fcc-3ce9-4216-acec-00000000280b] 30564 1726882931.91848: sending task result for task 0e448fcc-3ce9-4216-acec-00000000280b 30564 1726882931.91940: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000280b 30564 1726882931.91943: WORKER PROCESS EXITING ok: [managed_node2] => { "networkmanager_nvr": "NetworkManager-1.51.0-1.el9" } 30564 1726882931.91991: no more pending results, returning what we have 30564 1726882931.91994: results queue empty 30564 1726882931.91995: checking for any_errors_fatal 30564 1726882931.92006: done checking for any_errors_fatal 30564 1726882931.92007: checking for max_fail_percentage 30564 1726882931.92009: done checking for max_fail_percentage 30564 1726882931.92010: checking to see if all hosts have failed and the running result is not ok 30564 1726882931.92011: done checking to see if all hosts have failed 30564 1726882931.92012: getting the remaining hosts for this loop 30564 1726882931.92014: done getting the remaining hosts for this loop 30564 1726882931.92017: getting the next task for host managed_node2 30564 1726882931.92026: done getting next task for host managed_node2 30564 1726882931.92033: ^ task is: TASK: Conditional asserts 30564 1726882931.92036: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882931.92040: getting variables 30564 1726882931.92042: in VariableManager get_vars() 30564 1726882931.92080: Calling all_inventory to load vars for managed_node2 30564 1726882931.92087: Calling groups_inventory to load vars for managed_node2 30564 1726882931.92090: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882931.92099: Calling all_plugins_play to load vars for managed_node2 30564 1726882931.92101: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882931.92104: Calling groups_plugins_play to load vars for managed_node2 30564 1726882931.92963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882931.94471: done with get_vars() 30564 1726882931.94488: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:42:11 -0400 (0:00:00.044) 0:02:10.526 ****** 30564 1726882931.94547: entering _queue_task() for managed_node2/include_tasks 30564 1726882931.94743: worker is 1 (out of 1 available) 30564 1726882931.94758: exiting _queue_task() for managed_node2/include_tasks 30564 1726882931.94773: done queuing things up, now waiting for results queue to drain 30564 1726882931.94775: waiting for pending results... 
30564 1726882931.94959: running TaskExecutor() for managed_node2/TASK: Conditional asserts 30564 1726882931.95045: in run() - task 0e448fcc-3ce9-4216-acec-0000000020b3 30564 1726882931.95057: variable 'ansible_search_path' from source: unknown 30564 1726882931.95061: variable 'ansible_search_path' from source: unknown 30564 1726882931.95269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 30564 1726882931.97393: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 30564 1726882931.97396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 30564 1726882931.97399: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 30564 1726882931.97402: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 30564 1726882931.97404: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 30564 1726882931.97421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 30564 1726882931.97449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 30564 1726882931.97480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 30564 1726882931.97520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 30564 1726882931.97534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 30564 1726882931.97636: variable 'lsr_assert_when' from source: include params 30564 1726882931.97745: variable 'network_provider' from source: set_fact 30564 1726882931.97821: variable 'omit' from source: magic vars 30564 1726882931.97931: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882931.97943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882931.97947: variable 'omit' from source: magic vars 30564 1726882931.98141: variable 'ansible_distribution_major_version' from source: facts 30564 1726882931.98151: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882931.98262: variable 'item' from source: unknown 30564 1726882931.98274: Evaluated conditional (item['condition']): True 30564 1726882931.98350: variable 'item' from source: unknown 30564 1726882931.98386: variable 'item' from source: unknown 30564 1726882931.98445: variable 'item' from source: unknown 30564 1726882931.98593: dumping result to json 30564 1726882931.98595: done dumping result, returning 30564 1726882931.98598: done running TaskExecutor() for managed_node2/TASK: Conditional asserts [0e448fcc-3ce9-4216-acec-0000000020b3] 30564 1726882931.98600: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b3 30564 1726882931.98637: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b3 30564 1726882931.98640: WORKER PROCESS EXITING 30564 1726882931.98671: no more pending results, returning what we have 30564 1726882931.98677: in VariableManager get_vars() 30564 1726882931.98721: Calling all_inventory to load vars for managed_node2 30564 1726882931.98724: Calling groups_inventory to load vars for managed_node2 30564 1726882931.98727: 
Calling all_plugins_inventory to load vars for managed_node2 30564 1726882931.98736: Calling all_plugins_play to load vars for managed_node2 30564 1726882931.98739: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882931.98742: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.00439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.02308: done with get_vars() 30564 1726882932.02327: variable 'ansible_search_path' from source: unknown 30564 1726882932.02329: variable 'ansible_search_path' from source: unknown 30564 1726882932.02379: we have included files to process 30564 1726882932.02381: generating all_blocks data 30564 1726882932.02383: done generating all_blocks data 30564 1726882932.02388: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882932.02389: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882932.02391: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 30564 1726882932.02509: in VariableManager get_vars() 30564 1726882932.02530: done with get_vars() 30564 1726882932.02645: done processing included file 30564 1726882932.02647: iterating over new_blocks loaded from include file 30564 1726882932.02649: in VariableManager get_vars() 30564 1726882932.02667: done with get_vars() 30564 1726882932.02669: filtering new block on tags 30564 1726882932.02715: done filtering new block on tags 30564 1726882932.02718: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 => (item={'what': 
'tasks/assert_device_absent.yml', 'condition': True}) 30564 1726882932.02723: extending task lists for all hosts with included blocks 30564 1726882932.04378: done extending task lists 30564 1726882932.04380: done processing included files 30564 1726882932.04381: results queue empty 30564 1726882932.04381: checking for any_errors_fatal 30564 1726882932.04385: done checking for any_errors_fatal 30564 1726882932.04385: checking for max_fail_percentage 30564 1726882932.04387: done checking for max_fail_percentage 30564 1726882932.04387: checking to see if all hosts have failed and the running result is not ok 30564 1726882932.04388: done checking to see if all hosts have failed 30564 1726882932.04389: getting the remaining hosts for this loop 30564 1726882932.04390: done getting the remaining hosts for this loop 30564 1726882932.04393: getting the next task for host managed_node2 30564 1726882932.04397: done getting next task for host managed_node2 30564 1726882932.04399: ^ task is: TASK: Include the task 'get_interface_stat.yml' 30564 1726882932.04402: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882932.04410: getting variables 30564 1726882932.04411: in VariableManager get_vars() 30564 1726882932.04483: Calling all_inventory to load vars for managed_node2 30564 1726882932.04486: Calling groups_inventory to load vars for managed_node2 30564 1726882932.04489: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.04493: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.04495: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.04497: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.05906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.07776: done with get_vars() 30564 1726882932.07809: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:42:12 -0400 (0:00:00.133) 0:02:10.660 ****** 30564 1726882932.07882: entering _queue_task() for managed_node2/include_tasks 30564 1726882932.08244: worker is 1 (out of 1 available) 30564 1726882932.08258: exiting _queue_task() for managed_node2/include_tasks 30564 1726882932.08280: done queuing things up, now waiting for results queue to drain 30564 1726882932.08281: waiting for pending results... 
30564 1726882932.08599: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 30564 1726882932.08738: in run() - task 0e448fcc-3ce9-4216-acec-0000000028d3 30564 1726882932.08759: variable 'ansible_search_path' from source: unknown 30564 1726882932.08778: variable 'ansible_search_path' from source: unknown 30564 1726882932.08823: calling self._execute() 30564 1726882932.08947: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.08961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.08979: variable 'omit' from source: magic vars 30564 1726882932.09397: variable 'ansible_distribution_major_version' from source: facts 30564 1726882932.09414: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882932.09430: _execute() done 30564 1726882932.09445: dumping result to json 30564 1726882932.09454: done dumping result, returning 30564 1726882932.09462: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-4216-acec-0000000028d3] 30564 1726882932.09478: sending task result for task 0e448fcc-3ce9-4216-acec-0000000028d3 30564 1726882932.09611: no more pending results, returning what we have 30564 1726882932.09616: in VariableManager get_vars() 30564 1726882932.09672: Calling all_inventory to load vars for managed_node2 30564 1726882932.09675: Calling groups_inventory to load vars for managed_node2 30564 1726882932.09679: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.09695: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.09698: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.09702: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.10832: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000028d3 30564 1726882932.10836: WORKER PROCESS EXITING 30564 
1726882932.11578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.12791: done with get_vars() 30564 1726882932.12807: variable 'ansible_search_path' from source: unknown 30564 1726882932.12808: variable 'ansible_search_path' from source: unknown 30564 1726882932.12906: variable 'item' from source: include params 30564 1726882932.12933: we have included files to process 30564 1726882932.12934: generating all_blocks data 30564 1726882932.12935: done generating all_blocks data 30564 1726882932.12936: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882932.12937: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882932.12938: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 30564 1726882932.13062: done processing included file 30564 1726882932.13066: iterating over new_blocks loaded from include file 30564 1726882932.13067: in VariableManager get_vars() 30564 1726882932.13081: done with get_vars() 30564 1726882932.13082: filtering new block on tags 30564 1726882932.13099: done filtering new block on tags 30564 1726882932.13100: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 30564 1726882932.13104: extending task lists for all hosts with included blocks 30564 1726882932.13205: done extending task lists 30564 1726882932.13206: done processing included files 30564 1726882932.13207: results queue empty 30564 1726882932.13207: checking for any_errors_fatal 30564 1726882932.13210: done checking for any_errors_fatal 30564 1726882932.13211: checking for 
max_fail_percentage 30564 1726882932.13211: done checking for max_fail_percentage 30564 1726882932.13212: checking to see if all hosts have failed and the running result is not ok 30564 1726882932.13212: done checking to see if all hosts have failed 30564 1726882932.13213: getting the remaining hosts for this loop 30564 1726882932.13214: done getting the remaining hosts for this loop 30564 1726882932.13215: getting the next task for host managed_node2 30564 1726882932.13218: done getting next task for host managed_node2 30564 1726882932.13220: ^ task is: TASK: Get stat for interface {{ interface }} 30564 1726882932.13222: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882932.13223: getting variables 30564 1726882932.13224: in VariableManager get_vars() 30564 1726882932.13232: Calling all_inventory to load vars for managed_node2 30564 1726882932.13233: Calling groups_inventory to load vars for managed_node2 30564 1726882932.13235: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.13239: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.13241: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.13242: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.13982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.14977: done with get_vars() 30564 1726882932.14994: done getting variables 30564 1726882932.15082: variable 'interface' from source: play vars

TASK [Get stat for interface statebr] ******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 21:42:12 -0400 (0:00:00.072) 0:02:10.732 ******
30564 1726882932.15110: entering _queue_task() for managed_node2/stat 30564 1726882932.15389: worker is 1 (out of 1 available) 30564 1726882932.15402: exiting _queue_task() for managed_node2/stat 30564 1726882932.15413: done queuing things up, now waiting for results queue to drain 30564 1726882932.15414: waiting for pending results...
30564 1726882932.15702: running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr 30564 1726882932.15844: in run() - task 0e448fcc-3ce9-4216-acec-000000002979 30564 1726882932.15871: variable 'ansible_search_path' from source: unknown 30564 1726882932.15880: variable 'ansible_search_path' from source: unknown 30564 1726882932.15915: calling self._execute() 30564 1726882932.16007: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.16011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.16019: variable 'omit' from source: magic vars 30564 1726882932.16309: variable 'ansible_distribution_major_version' from source: facts 30564 1726882932.16335: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882932.16339: variable 'omit' from source: magic vars 30564 1726882932.16378: variable 'omit' from source: magic vars 30564 1726882932.16517: variable 'interface' from source: play vars 30564 1726882932.16521: variable 'omit' from source: magic vars 30564 1726882932.16524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882932.16682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882932.16686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882932.16688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.16691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.16693: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882932.16695: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.16698: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.16727: Set connection var ansible_timeout to 10 30564 1726882932.16733: Set connection var ansible_pipelining to False 30564 1726882932.16736: Set connection var ansible_shell_type to sh 30564 1726882932.16741: Set connection var ansible_shell_executable to /bin/sh 30564 1726882932.16749: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882932.16752: Set connection var ansible_connection to ssh 30564 1726882932.16790: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.16794: variable 'ansible_connection' from source: unknown 30564 1726882932.16797: variable 'ansible_module_compression' from source: unknown 30564 1726882932.16799: variable 'ansible_shell_type' from source: unknown 30564 1726882932.16801: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.16803: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.16805: variable 'ansible_pipelining' from source: unknown 30564 1726882932.16807: variable 'ansible_timeout' from source: unknown 30564 1726882932.16811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.17002: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 30564 1726882932.17011: variable 'omit' from source: magic vars 30564 1726882932.17013: starting attempt loop 30564 1726882932.17016: running the handler 30564 1726882932.17032: _low_level_execute_command(): starting 30564 1726882932.17040: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882932.17700: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882932.17704: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 30564 1726882932.17716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.17731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.17775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.17779: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882932.17790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.17805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882932.17813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882932.17816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882932.17825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.17834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.17846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.17854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.17861: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882932.17876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.17945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882932.17959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.17982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.18103: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 30564 1726882932.19748: stdout chunk (state=3): >>>/root <<< 30564 1726882932.19853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882932.19902: stderr chunk (state=3): >>><<< 30564 1726882932.19906: stdout chunk (state=3): >>><<< 30564 1726882932.19924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882932.19933: _low_level_execute_command(): starting 30564 1726882932.19940: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676 `" && echo ansible-tmp-1726882932.1992218-36276-24011223159676="` echo /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676 `" ) && sleep 0' 30564 
1726882932.20370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.20374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.20414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.20418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.20420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.20462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882932.20475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.20576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882932.22435: stdout chunk (state=3): >>>ansible-tmp-1726882932.1992218-36276-24011223159676=/root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676 <<< 30564 1726882932.22544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882932.22591: stderr chunk (state=3): >>><<< 30564 1726882932.22595: stdout chunk (state=3): >>><<< 30564 1726882932.22609: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882932.1992218-36276-24011223159676=/root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882932.22641: variable 'ansible_module_compression' from source: unknown 30564 1726882932.22689: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 30564 1726882932.22717: variable 'ansible_facts' from source: unknown 30564 1726882932.22775: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676/AnsiballZ_stat.py 30564 1726882932.22868: Sending initial data 30564 1726882932.22876: Sent initial data (152 bytes) 30564 1726882932.23504: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 30564 1726882932.23507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.23537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.23540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.23542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.23596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.23601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.23708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882932.25426: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30564 1726882932.25448: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 30564 1726882932.25543: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882932.25638: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmps_2tw83x /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676/AnsiballZ_stat.py <<< 30564 1726882932.25730: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882932.26973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882932.27146: stderr chunk (state=3): >>><<< 30564 1726882932.27159: stdout chunk (state=3): >>><<< 30564 1726882932.27177: done transferring module to remote 30564 1726882932.27194: _low_level_execute_command(): starting 30564 1726882932.27199: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676/ /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676/AnsiballZ_stat.py && sleep 0' 30564 1726882932.27784: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.27787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.27814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.27817: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882932.27834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.27846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882932.27856: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882932.27880: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882932.27886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.27892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.27904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.27916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.27923: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882932.27945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.28027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.28039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.28165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882932.29928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882932.30011: stderr chunk (state=3): >>><<< 30564 1726882932.30023: stdout chunk (state=3): >>><<< 30564 1726882932.30039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882932.30042: _low_level_execute_command(): starting 30564 1726882932.30045: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676/AnsiballZ_stat.py && sleep 0' 30564 1726882932.30446: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.30454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.30490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.30496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882932.30504: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882932.30509: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.30516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.30526: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.30533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.30593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.30599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.30716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882932.43840: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 30564 1726882932.44811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882932.44880: stderr chunk (state=3): >>><<< 30564 1726882932.44883: stdout chunk (state=3): >>><<< 30564 1726882932.44899: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/statebr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
30564 1726882932.44923: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/statebr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882932.44932: _low_level_execute_command(): starting 30564 1726882932.44938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882932.1992218-36276-24011223159676/ > /dev/null 2>&1 && sleep 0' 30564 1726882932.45744: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.45786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882932.45790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 30564 1726882932.45872: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882932.45905: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.46026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.46123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.46243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882932.48103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882932.48107: stdout chunk (state=3): >>><<< 30564 1726882932.48114: stderr chunk (state=3): >>><<< 30564 1726882932.48131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 30564 1726882932.48137: handler run complete 30564 1726882932.48161: attempt loop complete, returning result 30564 1726882932.48165: _execute() done 30564 1726882932.48169: dumping result to json 30564 1726882932.48175: done dumping result, returning 30564 1726882932.48182: done running TaskExecutor() for managed_node2/TASK: Get stat for interface statebr [0e448fcc-3ce9-4216-acec-000000002979] 30564 1726882932.48184: sending task result for task 0e448fcc-3ce9-4216-acec-000000002979 30564 1726882932.48292: done sending task result for task 0e448fcc-3ce9-4216-acec-000000002979 30564 1726882932.48295: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
30564 1726882932.48347: no more pending results, returning what we have 30564 1726882932.48351: results queue empty 30564 1726882932.48352: checking for any_errors_fatal 30564 1726882932.48353: done checking for any_errors_fatal 30564 1726882932.48354: checking for max_fail_percentage 30564 1726882932.48356: done checking for max_fail_percentage 30564 1726882932.48357: checking to see if all hosts have failed and the running result is not ok 30564 1726882932.48358: done checking to see if all hosts have failed 30564 1726882932.48359: getting the remaining hosts for this loop 30564 1726882932.48361: done getting the remaining hosts for this loop 30564 1726882932.48366: getting the next task for host managed_node2 30564 1726882932.48381: done getting next task for host managed_node2 30564 1726882932.48384: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 30564 1726882932.48389: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882932.48395: getting variables 30564 1726882932.48396: in VariableManager get_vars() 30564 1726882932.48443: Calling all_inventory to load vars for managed_node2 30564 1726882932.48446: Calling groups_inventory to load vars for managed_node2 30564 1726882932.48450: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.48460: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.48472: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.48476: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.50775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.51734: done with get_vars() 30564 1726882932.51752: done getting variables 30564 1726882932.51798: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882932.51890: variable 'interface' from source: play vars

TASK [Assert that the interface is absent - 'statebr'] *************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5
Friday 20 September 2024 21:42:12 -0400 (0:00:00.368) 0:02:11.100 ******
30564 1726882932.51913: entering _queue_task() for managed_node2/assert 30564 1726882932.52130: worker is 1 (out of 1 available) 30564 1726882932.52142: exiting _queue_task() for managed_node2/assert 30564 1726882932.52159: done queuing things up, now waiting for results queue to drain 30564 1726882932.52161: waiting for pending results... 30564 1726882932.52354: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' 30564 1726882932.52439: in run() - task 0e448fcc-3ce9-4216-acec-0000000028d4 30564 1726882932.52449: variable 'ansible_search_path' from source: unknown 30564 1726882932.52452: variable 'ansible_search_path' from source: unknown 30564 1726882932.52484: calling self._execute() 30564 1726882932.52568: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.52575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.52709: variable 'omit' from source: magic vars 30564 1726882932.53067: variable 'ansible_distribution_major_version' from source: facts 30564 1726882932.53070: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882932.53073: variable 'omit' from source: magic vars 30564 1726882932.53077: variable 'omit' from source: magic vars 30564 1726882932.53372: variable 'interface' from source: play vars 30564 1726882932.53376: variable 'omit' from source: magic vars 30564 1726882932.53379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882932.53382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882932.53384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
30564 1726882932.53386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.53388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.53390: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882932.53392: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.53394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.53396: Set connection var ansible_timeout to 10 30564 1726882932.53398: Set connection var ansible_pipelining to False 30564 1726882932.53400: Set connection var ansible_shell_type to sh 30564 1726882932.53402: Set connection var ansible_shell_executable to /bin/sh 30564 1726882932.53404: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882932.53406: Set connection var ansible_connection to ssh 30564 1726882932.53470: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.53476: variable 'ansible_connection' from source: unknown 30564 1726882932.53479: variable 'ansible_module_compression' from source: unknown 30564 1726882932.53482: variable 'ansible_shell_type' from source: unknown 30564 1726882932.53484: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.53486: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.53488: variable 'ansible_pipelining' from source: unknown 30564 1726882932.53490: variable 'ansible_timeout' from source: unknown 30564 1726882932.53492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.53568: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882932.53611: variable 'omit' from source: magic vars 30564 1726882932.53614: starting attempt loop 30564 1726882932.53617: running the handler 30564 1726882932.53720: variable 'interface_stat' from source: set_fact 30564 1726882932.53728: Evaluated conditional (not interface_stat.stat.exists): True 30564 1726882932.53734: handler run complete 30564 1726882932.53748: attempt loop complete, returning result 30564 1726882932.53751: _execute() done 30564 1726882932.53753: dumping result to json 30564 1726882932.53755: done dumping result, returning 30564 1726882932.53763: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'statebr' [0e448fcc-3ce9-4216-acec-0000000028d4] 30564 1726882932.53769: sending task result for task 0e448fcc-3ce9-4216-acec-0000000028d4 30564 1726882932.53860: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000028d4 30564 1726882932.53863: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 30564 1726882932.54015: no more pending results, returning what we have 30564 1726882932.54017: results queue empty 30564 1726882932.54018: checking for any_errors_fatal 30564 1726882932.54025: done checking for any_errors_fatal 30564 1726882932.54026: checking for max_fail_percentage 30564 1726882932.54028: done checking for max_fail_percentage 30564 1726882932.54028: checking to see if all hosts have failed and the running result is not ok 30564 1726882932.54029: done checking to see if all hosts have failed 30564 1726882932.54030: getting the remaining hosts for this loop 30564 1726882932.54031: done getting the remaining hosts for this loop 30564 1726882932.54034: getting the next task for host managed_node2 30564 1726882932.54043: done getting next task for host managed_node2 
30564 1726882932.54046: ^ task is: TASK: Success in test '{{ lsr_description }}' 30564 1726882932.54048: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882932.54052: getting variables 30564 1726882932.54053: in VariableManager get_vars() 30564 1726882932.54098: Calling all_inventory to load vars for managed_node2 30564 1726882932.54101: Calling groups_inventory to load vars for managed_node2 30564 1726882932.54104: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.54113: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.54117: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.54120: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.55653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.57640: done with get_vars() 30564 1726882932.57661: done getting variables 30564 1726882932.57723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 30564 1726882932.57835: variable 'lsr_description' from source: include params TASK [Success in test 'I will not get an error when I try to remove an absent profile'] *** task 
path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:42:12 -0400 (0:00:00.059) 0:02:11.159 ****** 30564 1726882932.57862: entering _queue_task() for managed_node2/debug 30564 1726882932.58160: worker is 1 (out of 1 available) 30564 1726882932.58175: exiting _queue_task() for managed_node2/debug 30564 1726882932.58187: done queuing things up, now waiting for results queue to drain 30564 1726882932.58189: waiting for pending results... 30564 1726882932.58506: running TaskExecutor() for managed_node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' 30564 1726882932.58643: in run() - task 0e448fcc-3ce9-4216-acec-0000000020b4 30564 1726882932.58671: variable 'ansible_search_path' from source: unknown 30564 1726882932.58683: variable 'ansible_search_path' from source: unknown 30564 1726882932.58722: calling self._execute() 30564 1726882932.58837: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.58853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.58871: variable 'omit' from source: magic vars 30564 1726882932.59271: variable 'ansible_distribution_major_version' from source: facts 30564 1726882932.59294: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882932.59306: variable 'omit' from source: magic vars 30564 1726882932.59357: variable 'omit' from source: magic vars 30564 1726882932.59470: variable 'lsr_description' from source: include params 30564 1726882932.59495: variable 'omit' from source: magic vars 30564 1726882932.59550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882932.59593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882932.59623: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882932.59653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.59671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.59703: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882932.59711: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.59722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.59837: Set connection var ansible_timeout to 10 30564 1726882932.59848: Set connection var ansible_pipelining to False 30564 1726882932.59860: Set connection var ansible_shell_type to sh 30564 1726882932.59879: Set connection var ansible_shell_executable to /bin/sh 30564 1726882932.59893: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882932.59901: Set connection var ansible_connection to ssh 30564 1726882932.59931: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.59941: variable 'ansible_connection' from source: unknown 30564 1726882932.59947: variable 'ansible_module_compression' from source: unknown 30564 1726882932.59952: variable 'ansible_shell_type' from source: unknown 30564 1726882932.59957: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.59967: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.59979: variable 'ansible_pipelining' from source: unknown 30564 1726882932.59984: variable 'ansible_timeout' from source: unknown 30564 1726882932.59990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.60135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882932.60157: variable 'omit' from source: magic vars 30564 1726882932.60169: starting attempt loop 30564 1726882932.60176: running the handler 30564 1726882932.60228: handler run complete 30564 1726882932.60247: attempt loop complete, returning result 30564 1726882932.60254: _execute() done 30564 1726882932.60265: dumping result to json 30564 1726882932.60275: done dumping result, returning 30564 1726882932.60292: done running TaskExecutor() for managed_node2/TASK: Success in test 'I will not get an error when I try to remove an absent profile' [0e448fcc-3ce9-4216-acec-0000000020b4] 30564 1726882932.60308: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b4 ok: [managed_node2] => {} MSG: +++++ Success in test 'I will not get an error when I try to remove an absent profile' +++++ 30564 1726882932.60456: no more pending results, returning what we have 30564 1726882932.60460: results queue empty 30564 1726882932.60462: checking for any_errors_fatal 30564 1726882932.60475: done checking for any_errors_fatal 30564 1726882932.60476: checking for max_fail_percentage 30564 1726882932.60478: done checking for max_fail_percentage 30564 1726882932.60479: checking to see if all hosts have failed and the running result is not ok 30564 1726882932.60479: done checking to see if all hosts have failed 30564 1726882932.60481: getting the remaining hosts for this loop 30564 1726882932.60482: done getting the remaining hosts for this loop 30564 1726882932.60487: getting the next task for host managed_node2 30564 1726882932.60497: done getting next task for host managed_node2 30564 1726882932.60501: ^ task is: TASK: Cleanup 30564 1726882932.60505: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882932.60513: getting variables 30564 1726882932.60515: in VariableManager get_vars() 30564 1726882932.60568: Calling all_inventory to load vars for managed_node2 30564 1726882932.60571: Calling groups_inventory to load vars for managed_node2 30564 1726882932.60575: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.60587: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.60590: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.60593: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.61613: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b4 30564 1726882932.61616: WORKER PROCESS EXITING 30564 1726882932.62631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.64438: done with get_vars() 30564 1726882932.64470: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:42:12 -0400 (0:00:00.066) 0:02:11.226 ****** 30564 1726882932.64561: entering _queue_task() for managed_node2/include_tasks 30564 1726882932.64846: worker is 1 (out of 1 available) 30564 1726882932.64859: exiting _queue_task() for managed_node2/include_tasks 30564 1726882932.64873: done queuing things up, now waiting for results queue to drain 30564 
1726882932.64879: waiting for pending results... 30564 1726882932.65169: running TaskExecutor() for managed_node2/TASK: Cleanup 30564 1726882932.65289: in run() - task 0e448fcc-3ce9-4216-acec-0000000020b8 30564 1726882932.65308: variable 'ansible_search_path' from source: unknown 30564 1726882932.65322: variable 'ansible_search_path' from source: unknown 30564 1726882932.65374: variable 'lsr_cleanup' from source: include params 30564 1726882932.65585: variable 'lsr_cleanup' from source: include params 30564 1726882932.65666: variable 'omit' from source: magic vars 30564 1726882932.65821: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.65837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.65852: variable 'omit' from source: magic vars 30564 1726882932.66119: variable 'ansible_distribution_major_version' from source: facts 30564 1726882932.66133: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882932.66143: variable 'item' from source: unknown 30564 1726882932.66219: variable 'item' from source: unknown 30564 1726882932.66254: variable 'item' from source: unknown 30564 1726882932.66329: variable 'item' from source: unknown 30564 1726882932.66527: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.66541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.66554: variable 'omit' from source: magic vars 30564 1726882932.66729: variable 'ansible_distribution_major_version' from source: facts 30564 1726882932.66740: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882932.66751: variable 'item' from source: unknown 30564 1726882932.66822: variable 'item' from source: unknown 30564 1726882932.66851: variable 'item' from source: unknown 30564 1726882932.66926: variable 'item' from source: unknown 30564 1726882932.67021: dumping result to json 30564 
1726882932.67032: done dumping result, returning 30564 1726882932.67041: done running TaskExecutor() for managed_node2/TASK: Cleanup [0e448fcc-3ce9-4216-acec-0000000020b8] 30564 1726882932.67052: sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b8 30564 1726882932.67148: no more pending results, returning what we have 30564 1726882932.67154: in VariableManager get_vars() 30564 1726882932.67213: Calling all_inventory to load vars for managed_node2 30564 1726882932.67215: Calling groups_inventory to load vars for managed_node2 30564 1726882932.67219: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.67236: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.67240: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.67243: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.68313: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000020b8 30564 1726882932.68317: WORKER PROCESS EXITING 30564 1726882932.69304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.78159: done with get_vars() 30564 1726882932.78189: variable 'ansible_search_path' from source: unknown 30564 1726882932.78190: variable 'ansible_search_path' from source: unknown 30564 1726882932.78231: variable 'ansible_search_path' from source: unknown 30564 1726882932.78232: variable 'ansible_search_path' from source: unknown 30564 1726882932.78270: we have included files to process 30564 1726882932.78272: generating all_blocks data 30564 1726882932.78274: done generating all_blocks data 30564 1726882932.78276: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882932.78278: loading included file: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882932.78281: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml 30564 1726882932.78476: done processing included file 30564 1726882932.78478: iterating over new_blocks loaded from include file 30564 1726882932.78480: in VariableManager get_vars() 30564 1726882932.78497: done with get_vars() 30564 1726882932.78498: filtering new block on tags 30564 1726882932.78518: done filtering new block on tags 30564 1726882932.78520: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml for managed_node2 => (item=tasks/cleanup_profile+device.yml) 30564 1726882932.78524: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30564 1726882932.78525: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30564 1726882932.78528: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 30564 1726882932.78865: done processing included file 30564 1726882932.78867: iterating over new_blocks loaded from include file 30564 1726882932.78868: in VariableManager get_vars() 30564 1726882932.78885: done with get_vars() 30564 1726882932.78887: filtering new block on tags 30564 1726882932.78930: done filtering new block on tags 30564 1726882932.78933: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 => (item=tasks/check_network_dns.yml) 
30564 1726882932.78937: extending task lists for all hosts with included blocks 30564 1726882932.80788: done extending task lists 30564 1726882932.80789: done processing included files 30564 1726882932.80790: results queue empty 30564 1726882932.80791: checking for any_errors_fatal 30564 1726882932.80795: done checking for any_errors_fatal 30564 1726882932.80796: checking for max_fail_percentage 30564 1726882932.80797: done checking for max_fail_percentage 30564 1726882932.80798: checking to see if all hosts have failed and the running result is not ok 30564 1726882932.80798: done checking to see if all hosts have failed 30564 1726882932.80799: getting the remaining hosts for this loop 30564 1726882932.80800: done getting the remaining hosts for this loop 30564 1726882932.80803: getting the next task for host managed_node2 30564 1726882932.80807: done getting next task for host managed_node2 30564 1726882932.80810: ^ task is: TASK: Cleanup profile and device 30564 1726882932.80812: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 30564 1726882932.80815: getting variables 30564 1726882932.80816: in VariableManager get_vars() 30564 1726882932.80830: Calling all_inventory to load vars for managed_node2 30564 1726882932.80837: Calling groups_inventory to load vars for managed_node2 30564 1726882932.80839: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882932.80845: Calling all_plugins_play to load vars for managed_node2 30564 1726882932.80847: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882932.80850: Calling groups_plugins_play to load vars for managed_node2 30564 1726882932.82200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882932.84006: done with get_vars() 30564 1726882932.84045: done getting variables 30564 1726882932.84096: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Cleanup profile and device] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_profile+device.yml:3 Friday 20 September 2024 21:42:12 -0400 (0:00:00.195) 0:02:11.422 ****** 30564 1726882932.84137: entering _queue_task() for managed_node2/shell 30564 1726882932.84537: worker is 1 (out of 1 available) 30564 1726882932.84556: exiting _queue_task() for managed_node2/shell 30564 1726882932.84572: done queuing things up, now waiting for results queue to drain 30564 1726882932.84576: waiting for pending results... 
30564 1726882932.84938: running TaskExecutor() for managed_node2/TASK: Cleanup profile and device 30564 1726882932.85081: in run() - task 0e448fcc-3ce9-4216-acec-00000000299e 30564 1726882932.85109: variable 'ansible_search_path' from source: unknown 30564 1726882932.85122: variable 'ansible_search_path' from source: unknown 30564 1726882932.85166: calling self._execute() 30564 1726882932.85284: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.85296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.85317: variable 'omit' from source: magic vars 30564 1726882932.85768: variable 'ansible_distribution_major_version' from source: facts 30564 1726882932.85792: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882932.85803: variable 'omit' from source: magic vars 30564 1726882932.85859: variable 'omit' from source: magic vars 30564 1726882932.86047: variable 'interface' from source: play vars 30564 1726882932.86083: variable 'omit' from source: magic vars 30564 1726882932.86136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882932.86184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882932.86218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882932.86241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.86259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882932.86305: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882932.86318: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.86326: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.86445: Set connection var ansible_timeout to 10 30564 1726882932.86458: Set connection var ansible_pipelining to False 30564 1726882932.86467: Set connection var ansible_shell_type to sh 30564 1726882932.86480: Set connection var ansible_shell_executable to /bin/sh 30564 1726882932.86492: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882932.86498: Set connection var ansible_connection to ssh 30564 1726882932.86544: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.86552: variable 'ansible_connection' from source: unknown 30564 1726882932.86558: variable 'ansible_module_compression' from source: unknown 30564 1726882932.86566: variable 'ansible_shell_type' from source: unknown 30564 1726882932.86575: variable 'ansible_shell_executable' from source: unknown 30564 1726882932.86581: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882932.86587: variable 'ansible_pipelining' from source: unknown 30564 1726882932.86593: variable 'ansible_timeout' from source: unknown 30564 1726882932.86600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882932.86762: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882932.86784: variable 'omit' from source: magic vars 30564 1726882932.86793: starting attempt loop 30564 1726882932.86799: running the handler 30564 1726882932.86812: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882932.86844: _low_level_execute_command(): starting 30564 1726882932.86860: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882932.87592: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.87603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.87649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.87653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882932.87657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.87706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.87709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.87841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882932.89522: stdout chunk (state=3): >>>/root <<< 30564 1726882932.89623: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 30564 1726882932.89687: stderr chunk (state=3): >>><<< 30564 1726882932.89695: stdout chunk (state=3): >>><<< 30564 1726882932.89718: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882932.89729: _low_level_execute_command(): starting 30564 1726882932.89735: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614 `" && echo ansible-tmp-1726882932.8971686-36308-39120574178614="` echo /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614 `" ) && sleep 0' 30564 1726882932.90316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882932.90324: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 30564 1726882932.90335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.90346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.90389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.90396: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882932.90407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.90421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882932.90429: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882932.90436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882932.90444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.90454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.90467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.90483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.90491: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882932.90502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.90597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882932.90606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.90609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.90859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30564 1726882932.92657: stdout chunk (state=3): >>>ansible-tmp-1726882932.8971686-36308-39120574178614=/root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614 <<< 30564 1726882932.92765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882932.92819: stderr chunk (state=3): >>><<< 30564 1726882932.92822: stdout chunk (state=3): >>><<< 30564 1726882932.92838: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882932.8971686-36308-39120574178614=/root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882932.92873: variable 'ansible_module_compression' from source: unknown 30564 1726882932.92916: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 
1726882932.92947: variable 'ansible_facts' from source: unknown 30564 1726882932.93013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614/AnsiballZ_command.py 30564 1726882932.93122: Sending initial data 30564 1726882932.93125: Sent initial data (155 bytes) 30564 1726882932.94072: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882932.94076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.94078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.94081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.94085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.94088: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882932.94090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.94099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882932.94278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882932.94282: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882932.94285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.94287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.94290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.94292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.94294: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882932.94297: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.94299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882932.94301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.94303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.94948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882932.96183: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 30564 1726882932.96210: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882932.96301: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882932.96407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmpdgnac87m /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614/AnsiballZ_command.py <<< 30564 1726882932.96497: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882932.97841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882932.97927: stderr chunk (state=3): >>><<< 30564 1726882932.97930: stdout chunk (state=3): >>><<< 30564 1726882932.97953: done transferring module to remote 30564 1726882932.97966: 
_low_level_execute_command(): starting 30564 1726882932.97974: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614/ /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614/AnsiballZ_command.py && sleep 0' 30564 1726882932.98606: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882932.98614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.98625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.98660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.98710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.98717: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882932.98726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.98739: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882932.98746: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882932.98753: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882932.98761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882932.98776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882932.98788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882932.98793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882932.98802: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882932.98817: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882932.98936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882932.98954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882932.98969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882932.99132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.01380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.01384: stderr chunk (state=3): >>><<< 30564 1726882933.01386: stdout chunk (state=3): >>><<< 30564 1726882933.01388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.01390: _low_level_execute_command(): starting 30564 
1726882933.01392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614/AnsiballZ_command.py && sleep 0' 30564 1726882933.01704: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882933.01716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.01742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.01744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.01791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.01798: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882933.01807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.01840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882933.01845: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882933.01848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882933.01850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.01855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.01867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.01880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.01894: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882933.01903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 
1726882933.02060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.02145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882933.02150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.02153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.18753: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:42:13.149492", "end": "2024-09-20 21:42:13.185408", "delta": "0:00:00.035916", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882933.19992: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882933.20019: stderr chunk (state=3): >>><<< 30564 1726882933.20022: stdout chunk (state=3): >>><<< 30564 1726882933.20174: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Error: unknown connection 'statebr'.\nError: cannot delete unknown connection(s): 'statebr'.\nCannot find device \"statebr\"", "rc": 1, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "start": "2024-09-20 21:42:13.149492", "end": "2024-09-20 21:42:13.185408", "delta": "0:00:00.035916", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 30564 1726882933.20185: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882933.20188: _low_level_execute_command(): starting 30564 1726882933.20190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882932.8971686-36308-39120574178614/ > /dev/null 2>&1 && sleep 0' 30564 1726882933.20736: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882933.20750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.20771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.20790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.20832: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.20845: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882933.20859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.20883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882933.20894: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882933.20903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882933.20914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.20926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.20946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.20958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.20982: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882933.21003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.21072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.21083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.21188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.22995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.23049: stderr chunk (state=3): >>><<< 30564 1726882933.23052: stdout chunk (state=3): >>><<< 30564 1726882933.23066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.23077: handler run complete 30564 1726882933.23100: Evaluated conditional (False): False 30564 1726882933.23207: attempt loop complete, returning result 30564 1726882933.23210: _execute() done 30564 1726882933.23212: dumping result to json 30564 1726882933.23216: done dumping result, returning 30564 1726882933.23218: done running TaskExecutor() for managed_node2/TASK: Cleanup profile and device [0e448fcc-3ce9-4216-acec-00000000299e] 30564 1726882933.23221: sending task result for task 0e448fcc-3ce9-4216-acec-00000000299e 30564 1726882933.23302: done sending task result for task 0e448fcc-3ce9-4216-acec-00000000299e fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": "nmcli con delete statebr\nnmcli con load /etc/sysconfig/network-scripts/ifcfg-statebr\nrm -f /etc/sysconfig/network-scripts/ifcfg-statebr\nip link del statebr\n", "delta": "0:00:00.035916", "end": "2024-09-20 21:42:13.185408", "rc": 1, "start": "2024-09-20 21:42:13.149492" } STDERR: Error: unknown connection 'statebr'. Error: cannot delete unknown connection(s): 'statebr'. Cannot find device "statebr" MSG: non-zero return code ...ignoring 30564 1726882933.23380: no more pending results, returning what we have 30564 1726882933.23386: results queue empty 30564 1726882933.23387: checking for any_errors_fatal 30564 1726882933.23389: done checking for any_errors_fatal 30564 1726882933.23390: checking for max_fail_percentage 30564 1726882933.23391: done checking for max_fail_percentage 30564 1726882933.23392: checking to see if all hosts have failed and the running result is not ok 30564 1726882933.23393: done checking to see if all hosts have failed 30564 1726882933.23394: getting the remaining hosts for this loop 30564 1726882933.23396: done getting the remaining hosts for this loop 30564 1726882933.23399: getting the next task for host managed_node2 30564 1726882933.23410: done getting next task for host managed_node2 30564 1726882933.23412: ^ task is: TASK: Check routes and DNS 30564 1726882933.23418: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882933.23423: getting variables 30564 1726882933.23424: in VariableManager get_vars() 30564 1726882933.23476: Calling all_inventory to load vars for managed_node2 30564 1726882933.23479: Calling groups_inventory to load vars for managed_node2 30564 1726882933.23483: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882933.23495: Calling all_plugins_play to load vars for managed_node2 30564 1726882933.23503: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882933.23506: Calling groups_plugins_play to load vars for managed_node2 30564 1726882933.24265: WORKER PROCESS EXITING 30564 1726882933.25319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882933.26296: done with get_vars() 30564 1726882933.26313: done getting variables 30564 1726882933.26354: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:42:13 -0400 (0:00:00.422) 0:02:11.845 ****** 30564 1726882933.26381: entering _queue_task() for managed_node2/shell 30564 1726882933.26614: worker is 1 (out of 1 available) 30564 1726882933.26626: exiting _queue_task() for managed_node2/shell 30564 1726882933.26639: done queuing things up, now waiting for results queue to drain 30564 
1726882933.26641: waiting for pending results... 30564 1726882933.26865: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 30564 1726882933.26970: in run() - task 0e448fcc-3ce9-4216-acec-0000000029a2 30564 1726882933.26979: variable 'ansible_search_path' from source: unknown 30564 1726882933.27016: variable 'ansible_search_path' from source: unknown 30564 1726882933.27025: calling self._execute() 30564 1726882933.27670: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882933.27675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882933.27678: variable 'omit' from source: magic vars 30564 1726882933.27681: variable 'ansible_distribution_major_version' from source: facts 30564 1726882933.27683: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882933.27685: variable 'omit' from source: magic vars 30564 1726882933.27687: variable 'omit' from source: magic vars 30564 1726882933.27689: variable 'omit' from source: magic vars 30564 1726882933.27692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882933.28158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882933.28162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882933.28169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882933.28172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882933.28175: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882933.28177: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882933.28180: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 30564 1726882933.28182: Set connection var ansible_timeout to 10 30564 1726882933.28184: Set connection var ansible_pipelining to False 30564 1726882933.28186: Set connection var ansible_shell_type to sh 30564 1726882933.28189: Set connection var ansible_shell_executable to /bin/sh 30564 1726882933.28192: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882933.28194: Set connection var ansible_connection to ssh 30564 1726882933.28196: variable 'ansible_shell_executable' from source: unknown 30564 1726882933.28198: variable 'ansible_connection' from source: unknown 30564 1726882933.28201: variable 'ansible_module_compression' from source: unknown 30564 1726882933.28203: variable 'ansible_shell_type' from source: unknown 30564 1726882933.28205: variable 'ansible_shell_executable' from source: unknown 30564 1726882933.28207: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882933.28210: variable 'ansible_pipelining' from source: unknown 30564 1726882933.28212: variable 'ansible_timeout' from source: unknown 30564 1726882933.28214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882933.28217: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882933.28220: variable 'omit' from source: magic vars 30564 1726882933.28223: starting attempt loop 30564 1726882933.28225: running the handler 30564 1726882933.28228: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 30564 1726882933.28230: _low_level_execute_command(): starting 30564 1726882933.28233: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882933.29030: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882933.29034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.29036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.29039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.29041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.29043: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882933.29047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.29049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882933.29051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882933.29053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882933.29055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.29057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.29060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.29062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.29073: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882933.29077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.29080: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.29082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882933.29085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.29196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.30774: stdout chunk (state=3): >>>/root <<< 30564 1726882933.30879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.30919: stderr chunk (state=3): >>><<< 30564 1726882933.30922: stdout chunk (state=3): >>><<< 30564 1726882933.30941: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.30953: _low_level_execute_command(): starting 30564 1726882933.30974: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898 `" && echo ansible-tmp-1726882933.3094053-36330-90276027878898="` echo /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898 `" ) && sleep 0' 30564 1726882933.31393: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.31406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.31439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.31481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.31488: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882933.31497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.31514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882933.31521: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882933.31527: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882933.31535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.31543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.31556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.31562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.31572: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882933.31588: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.31694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.31697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882933.31711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.31842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.33691: stdout chunk (state=3): >>>ansible-tmp-1726882933.3094053-36330-90276027878898=/root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898 <<< 30564 1726882933.33797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.33842: stderr chunk (state=3): >>><<< 30564 1726882933.33845: stdout chunk (state=3): >>><<< 30564 1726882933.33865: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882933.3094053-36330-90276027878898=/root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.33896: variable 'ansible_module_compression' from source: unknown 30564 1726882933.33935: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882933.33965: variable 'ansible_facts' from source: unknown 30564 1726882933.34028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898/AnsiballZ_command.py 30564 1726882933.34134: Sending initial data 30564 1726882933.34138: Sent initial data (155 bytes) 30564 1726882933.34779: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.34783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.34813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.34817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.34819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.34878: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 30564 1726882933.34881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.34986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.36718: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 30564 1726882933.36811: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882933.36911: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmppgx0bj97 /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898/AnsiballZ_command.py <<< 30564 1726882933.37006: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882933.38019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.38117: stderr chunk (state=3): >>><<< 30564 1726882933.38120: stdout chunk (state=3): >>><<< 30564 1726882933.38135: done transferring module to remote 30564 1726882933.38144: _low_level_execute_command(): starting 30564 1726882933.38149: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898/ /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898/AnsiballZ_command.py && 
sleep 0' 30564 1726882933.38580: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.38587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.38621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882933.38634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 30564 1726882933.38646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.38694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.38706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.38812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.40536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.40589: stderr chunk (state=3): >>><<< 30564 1726882933.40592: stdout chunk (state=3): >>><<< 30564 1726882933.40607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.40610: _low_level_execute_command(): starting 30564 1726882933.40613: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898/AnsiballZ_command.py && sleep 0' 30564 1726882933.41037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.41040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.41072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.41078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.41080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.41133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.41136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.41247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.55209: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2930sec preferred_lft 2930sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 
pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:42:13.541595", "end": "2024-09-20 21:42:13.549881", "delta": "0:00:00.008286", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882933.56317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882933.56380: stderr chunk (state=3): >>><<< 30564 1726882933.56383: stdout chunk (state=3): >>><<< 30564 1726882933.56401: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2930sec preferred_lft 2930sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:42:13.541595", "end": "2024-09-20 21:42:13.549881", "delta": "0:00:00.008286", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo 
pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
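The `_low_level_execute_command()` calls in this trace show the idiom Ansible's ssh connection plugin uses to stage a working directory on the managed node: a subshell sets `umask 77`, creates the timestamped `ansible-tmp-*` directory, and echoes a `name=path` line back so the controller learns the resolved path. A minimal standalone sketch of that same idiom (the paths and the stamp below are illustrative stand-ins, not this run's actual `/root/.ansible/tmp/ansible-tmp-1726882933…` name):

```shell
#!/bin/sh
# Stand-ins for the controller-chosen locations in the real command.
tmp_root="$(mktemp -d)"                 # plays the role of ~/.ansible/tmp
stamp="ansible-tmp-example-0001-42"     # plays the role of the timestamped dir name

# Same shape as the logged command: umask 77 so the tmpdir is mode 0700,
# mkdir -p for the root, plain mkdir for the per-task dir (fails if it
# already exists), then echo name=path so the caller can parse stdout.
( umask 77 && mkdir -p "$tmp_root" \
    && mkdir "$tmp_root/$stamp" \
    && echo "$stamp=$tmp_root/$stamp" ) && sleep 0
```

The trailing `&& sleep 0` mirrors the logged command; Ansible appends it so the shell's exit status reflects the whole pipeline rather than the last builtin.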
30564 1726882933.56440: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882933.56449: _low_level_execute_command(): starting 30564 1726882933.56456: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882933.3094053-36330-90276027878898/ > /dev/null 2>&1 && sleep 0' 30564 1726882933.56912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.56917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.56961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.56969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.56972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.57034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.57041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882933.57043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.57140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.58922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.58970: stderr chunk (state=3): >>><<< 30564 1726882933.58975: stdout chunk (state=3): >>><<< 30564 1726882933.58990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.58997: handler run complete 30564 1726882933.59017: Evaluated conditional (False): False 30564 1726882933.59029: attempt loop complete, returning result 30564 1726882933.59032: _execute() done 30564 1726882933.59035: dumping result to json 30564 1726882933.59043: done dumping result, returning 30564 1726882933.59051: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0e448fcc-3ce9-4216-acec-0000000029a2] 30564 1726882933.59056: sending task result for task 0e448fcc-3ce9-4216-acec-0000000029a2 30564 1726882933.59172: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000029a2 30564 1726882933.59175: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008286", "end": "2024-09-20 21:42:13.549881", "rc": 0, "start": "2024-09-20 21:42:13.541595" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2930sec preferred_lft 2930sec inet6 fe80::104f:68ff:fe7a:deb1/64 scope link valid_lft forever preferred_lft forever 30: rpltstbr: mtu 
1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 30564 1726882933.59241: no more pending results, returning what we have 30564 1726882933.59245: results queue empty 30564 1726882933.59246: checking for any_errors_fatal 30564 1726882933.59255: done checking for any_errors_fatal 30564 1726882933.59256: checking for max_fail_percentage 30564 1726882933.59258: done checking for max_fail_percentage 30564 1726882933.59259: checking to see if all hosts have failed and the running result is not ok 30564 1726882933.59259: done checking to see if all hosts have failed 30564 1726882933.59260: getting the remaining hosts for this loop 30564 1726882933.59262: done getting the remaining hosts for this loop 30564 1726882933.59268: getting the next task for host managed_node2 30564 1726882933.59276: done getting next task for host managed_node2 30564 1726882933.59278: ^ task is: TASK: Verify DNS and network connectivity 30564 1726882933.59282: ^ state is: HOST STATE: block=8, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 30564 1726882933.59292: getting variables 30564 1726882933.59294: in VariableManager get_vars() 30564 1726882933.59335: Calling all_inventory to load vars for managed_node2 30564 1726882933.59337: Calling groups_inventory to load vars for managed_node2 30564 1726882933.59340: Calling all_plugins_inventory to load vars for managed_node2 30564 1726882933.59351: Calling all_plugins_play to load vars for managed_node2 30564 1726882933.59354: Calling groups_plugins_inventory to load vars for managed_node2 30564 1726882933.59356: Calling groups_plugins_play to load vars for managed_node2 30564 1726882933.60366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 30564 1726882933.61337: done with get_vars() 30564 1726882933.61353: done getting variables 30564 1726882933.61401: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:42:13 -0400 (0:00:00.350) 0:02:12.195 ****** 30564 1726882933.61424: entering _queue_task() for managed_node2/shell 30564 1726882933.61648: worker is 1 (out 
of 1 available) 30564 1726882933.61661: exiting _queue_task() for managed_node2/shell 30564 1726882933.61677: done queuing things up, now waiting for results queue to drain 30564 1726882933.61678: waiting for pending results... 30564 1726882933.61864: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 30564 1726882933.61949: in run() - task 0e448fcc-3ce9-4216-acec-0000000029a3 30564 1726882933.61960: variable 'ansible_search_path' from source: unknown 30564 1726882933.61965: variable 'ansible_search_path' from source: unknown 30564 1726882933.61997: calling self._execute() 30564 1726882933.62076: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882933.62082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882933.62092: variable 'omit' from source: magic vars 30564 1726882933.62376: variable 'ansible_distribution_major_version' from source: facts 30564 1726882933.62387: Evaluated conditional (ansible_distribution_major_version != '6'): True 30564 1726882933.62487: variable 'ansible_facts' from source: unknown 30564 1726882933.62973: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 30564 1726882933.62978: variable 'omit' from source: magic vars 30564 1726882933.63010: variable 'omit' from source: magic vars 30564 1726882933.63033: variable 'omit' from source: magic vars 30564 1726882933.63078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 30564 1726882933.63101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 30564 1726882933.63116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 30564 1726882933.63129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882933.63139: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 30564 1726882933.63164: variable 'inventory_hostname' from source: host vars for 'managed_node2' 30564 1726882933.63170: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882933.63175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882933.63242: Set connection var ansible_timeout to 10 30564 1726882933.63248: Set connection var ansible_pipelining to False 30564 1726882933.63251: Set connection var ansible_shell_type to sh 30564 1726882933.63256: Set connection var ansible_shell_executable to /bin/sh 30564 1726882933.63263: Set connection var ansible_module_compression to ZIP_DEFLATED 30564 1726882933.63269: Set connection var ansible_connection to ssh 30564 1726882933.63292: variable 'ansible_shell_executable' from source: unknown 30564 1726882933.63297: variable 'ansible_connection' from source: unknown 30564 1726882933.63299: variable 'ansible_module_compression' from source: unknown 30564 1726882933.63302: variable 'ansible_shell_type' from source: unknown 30564 1726882933.63304: variable 'ansible_shell_executable' from source: unknown 30564 1726882933.63307: variable 'ansible_host' from source: host vars for 'managed_node2' 30564 1726882933.63309: variable 'ansible_pipelining' from source: unknown 30564 1726882933.63311: variable 'ansible_timeout' from source: unknown 30564 1726882933.63314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 30564 1726882933.63409: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882933.63418: variable 'omit' from source: magic vars 30564 1726882933.63424: starting attempt 
loop 30564 1726882933.63426: running the handler 30564 1726882933.63436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 30564 1726882933.63453: _low_level_execute_command(): starting 30564 1726882933.63460: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 30564 1726882933.63987: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.64003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.64014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.64026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.64078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882933.64101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.64201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 30564 1726882933.65754: stdout chunk (state=3): >>>/root <<< 30564 1726882933.65854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.65906: stderr chunk (state=3): >>><<< 30564 1726882933.65911: stdout chunk (state=3): >>><<< 30564 1726882933.65932: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.65941: _low_level_execute_command(): starting 30564 1726882933.65947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192 `" && echo ansible-tmp-1726882933.6592963-36347-115612795989192="` echo /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192 `" ) && sleep 0' 30564 1726882933.66368: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882933.66380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.66410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.66422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.66484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882933.66489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.66599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.68433: stdout chunk (state=3): >>>ansible-tmp-1726882933.6592963-36347-115612795989192=/root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192 <<< 30564 1726882933.68541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.68585: stderr chunk (state=3): >>><<< 30564 1726882933.68591: stdout chunk (state=3): >>><<< 30564 1726882933.68609: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882933.6592963-36347-115612795989192=/root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.68632: variable 'ansible_module_compression' from source: unknown 30564 1726882933.68673: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-30564uwjv555r/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 30564 1726882933.68706: variable 'ansible_facts' from source: unknown 30564 1726882933.68772: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192/AnsiballZ_command.py 30564 1726882933.68869: Sending initial data 30564 1726882933.68873: Sent initial data (156 bytes) 30564 1726882933.69506: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882933.69509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.69541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.69545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.69547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.69645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.69726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.71443: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 30564 1726882933.71446: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 <<< 30564 1726882933.71539: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 30564 1726882933.71639: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-30564uwjv555r/tmppcm7ubc2 /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192/AnsiballZ_command.py <<< 30564 1726882933.71732: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 30564 1726882933.72816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.72905: stderr chunk (state=3): >>><<< 30564 1726882933.72909: stdout chunk (state=3): >>><<< 30564 1726882933.72923: done transferring module to remote 30564 1726882933.72932: _low_level_execute_command(): starting 30564 1726882933.72936: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192/ /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192/AnsiballZ_command.py && sleep 0' 30564 1726882933.73350: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.73357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.73415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882933.73418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.73420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 30564 
1726882933.73422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882933.73424: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.73475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882933.73487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.73591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882933.75316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882933.75356: stderr chunk (state=3): >>><<< 30564 1726882933.75361: stdout chunk (state=3): >>><<< 30564 1726882933.75376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882933.75380: _low_level_execute_command(): starting 30564 1726882933.75384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192/AnsiballZ_command.py && sleep 0' 30564 1726882933.75801: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.75805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882933.75838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 30564 1726882933.75844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882933.75846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882933.75894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882933.75900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882933.76012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 
1726882934.01225: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 11730 0 --:--:-- --:--:-- --:--:-- 11730\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4098 0 --:--:-- --:--:-- --:--:-- 4041", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:42:13.888730", "end": "2024-09-20 21:42:14.010074", "delta": "0:00:00.121344", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 30564 1726882934.02594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 30564 1726882934.02615: stderr chunk (state=3): >>><<< 30564 1726882934.02619: stdout chunk (state=3): >>><<< 30564 1726882934.02771: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 11730 0 --:--:-- --:--:-- --:--:-- 11730\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4098 0 --:--:-- --:--:-- --:--:-- 4041", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND 
CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:42:13.888730", "end": "2024-09-20 21:42:14.010074", "delta": "0:00:00.121344", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 30564 1726882934.02780: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 30564 1726882934.02784: _low_level_execute_command(): starting 30564 1726882934.02787: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882933.6592963-36347-115612795989192/ > /dev/null 2>&1 && sleep 0' 30564 1726882934.03344: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 30564 1726882934.03359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882934.03377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882934.03395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882934.03436: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882934.03450: stderr chunk (state=3): >>>debug2: match not found <<< 30564 1726882934.03468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882934.03488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 30564 1726882934.03500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 30564 1726882934.03511: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 30564 1726882934.03524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 30564 1726882934.03537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 30564 1726882934.03554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 30564 1726882934.03569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 30564 1726882934.03581: stderr chunk (state=3): >>>debug2: match found <<< 30564 1726882934.03595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 30564 1726882934.03671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 30564 1726882934.03690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 30564 1726882934.03706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 30564 1726882934.03834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 30564 1726882934.05733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 30564 1726882934.05737: stdout chunk (state=3): >>><<< 30564 1726882934.05743: stderr chunk (state=3): >>><<< 30564 1726882934.05777: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 30564 1726882934.05781: handler run complete 30564 1726882934.05804: Evaluated conditional (False): False 30564 1726882934.05818: attempt loop complete, returning result 30564 1726882934.05821: _execute() done 30564 1726882934.05825: dumping result to json 30564 1726882934.05827: done dumping result, returning 30564 1726882934.05834: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-4216-acec-0000000029a3] 30564 1726882934.05840: sending task result for task 0e448fcc-3ce9-4216-acec-0000000029a3 30564 1726882934.05981: done sending task result for task 0e448fcc-3ce9-4216-acec-0000000029a3 30564 1726882934.05984: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; 
do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.121344", "end": "2024-09-20 21:42:14.010074", "rc": 0, "start": "2024-09-20 21:42:13.888730" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 11730 0 --:--:-- --:--:-- --:--:-- 11730 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 4098 0 --:--:-- --:--:-- --:--:-- 4041 
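The task execution traced above follows a fixed remote lifecycle: create a private tmpdir with the `umask 77 && mkdir … && echo …` pattern, sftp the `AnsiballZ_command.py` wrapper into it, `chmod u+x` the dir and payload, run it with the remote Python, then `rm -f -r` the tmpdir. A minimal local sketch of that sequence follows; the base directory, timestamp format, and payload are placeholders standing in for the remote `/root/.ansible/tmp` layout and the real AnsiballZ wrapper, not Ansible's actual code.

```shell
# Local sketch of the remote command lifecycle visible in the log above.
# $base stands in for /root/.ansible/tmp on the managed node (assumption:
# paths and the payload here are illustrative placeholders).
base="$(mktemp -d)"
stamp="ansible-tmp-$(date +%s)-$$-placeholder"

# Step 1: same umask/mkdir/echo pattern as the logged command, so the
# controller learns the tmpdir path from the subshell's stdout (umask 77
# gives the task tmpdir mode 0700).
tmpdir="$( (umask 77 && mkdir -p "$base" && mkdir "$base/$stamp" && echo "$base/$stamp") )"

# Step 2: placeholder payload standing in for the sftp'd AnsiballZ wrapper.
printf '%s\n' 'echo module-ran' > "$tmpdir/AnsiballZ_command.py"

# Step 3: make the tmpdir and payload executable, as in the logged chmod.
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_command.py"

# Step 4: execute the payload (the log invokes /usr/bin/python3.9 on the
# real wrapper; plain sh suffices for this placeholder).
out="$(sh "$tmpdir/AnsiballZ_command.py")"
echo "$out"

# Step 5: cleanup mirrors the final 'rm -f -r .../ansible-tmp-...' in the log.
rm -f -r "$tmpdir"
```

The subshell-echo in step 1 is why the logged stdout contains `ansible-tmp-…=/root/.ansible/tmp/ansible-tmp-…`: the controller parses that line to learn where to upload the module.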
30564 1726882934.06052: no more pending results, returning what we have 30564 1726882934.06056: results queue empty 30564 1726882934.06057: checking for any_errors_fatal 30564 1726882934.06078: done checking for any_errors_fatal 30564 1726882934.06079: checking for max_fail_percentage 30564 1726882934.06081: done checking for max_fail_percentage 30564 1726882934.06082: checking to see if all hosts have failed and the running result is not ok 30564 1726882934.06083: done checking to see if all hosts have failed 30564 1726882934.06084: getting the remaining hosts for this loop 30564 1726882934.06085: done getting the remaining hosts for this loop 30564 1726882934.06089: getting the next task for host managed_node2 30564 1726882934.06101: done getting next task for host managed_node2 30564 1726882934.06103: ^ task is: TASK: meta (flush_handlers) 30564 1726882934.06105: ^ state is: HOST STATE: block=9, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
30564 1726882934.06109: getting variables
30564 1726882934.06111: in VariableManager get_vars()
30564 1726882934.06150: Calling all_inventory to load vars for managed_node2
30564 1726882934.06152: Calling groups_inventory to load vars for managed_node2
30564 1726882934.06155: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882934.06168: Calling all_plugins_play to load vars for managed_node2
30564 1726882934.06171: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882934.06174: Calling groups_plugins_play to load vars for managed_node2
30564 1726882934.07820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882934.10993: done with get_vars()
30564 1726882934.11019: done getting variables
30564 1726882934.11097: in VariableManager get_vars()
30564 1726882934.11111: Calling all_inventory to load vars for managed_node2
30564 1726882934.11114: Calling groups_inventory to load vars for managed_node2
30564 1726882934.11116: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882934.11120: Calling all_plugins_play to load vars for managed_node2
30564 1726882934.11122: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882934.11125: Calling groups_plugins_play to load vars for managed_node2
30564 1726882934.13857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882934.15784: done with get_vars()
30564 1726882934.15826: done queuing things up, now waiting for results queue to drain
30564 1726882934.15828: results queue empty
30564 1726882934.15829: checking for any_errors_fatal
30564 1726882934.15833: done checking for any_errors_fatal
30564 1726882934.15834: checking for max_fail_percentage
30564 1726882934.15835: done checking for max_fail_percentage
30564 1726882934.15836: checking to see if all hosts have failed and the running result is not ok
30564 1726882934.15837: done checking to see if all hosts have failed
30564 1726882934.15838: getting the remaining hosts for this loop
30564 1726882934.15839: done getting the remaining hosts for this loop
30564 1726882934.15842: getting the next task for host managed_node2
30564 1726882934.15845: done getting next task for host managed_node2
30564 1726882934.15847: ^ task is: TASK: meta (flush_handlers)
30564 1726882934.15849: ^ state is: HOST STATE: block=10, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882934.15852: getting variables
30564 1726882934.15853: in VariableManager get_vars()
30564 1726882934.15868: Calling all_inventory to load vars for managed_node2
30564 1726882934.15870: Calling groups_inventory to load vars for managed_node2
30564 1726882934.15873: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882934.15879: Calling all_plugins_play to load vars for managed_node2
30564 1726882934.15881: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882934.15884: Calling groups_plugins_play to load vars for managed_node2
30564 1726882934.17283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882934.19136: done with get_vars()
30564 1726882934.19157: done getting variables
30564 1726882934.19218: in VariableManager get_vars()
30564 1726882934.19230: Calling all_inventory to load vars for managed_node2
30564 1726882934.19233: Calling groups_inventory to load vars for managed_node2
30564 1726882934.19235: Calling all_plugins_inventory to load vars for managed_node2
30564 1726882934.19240: Calling all_plugins_play to load vars for managed_node2
30564 1726882934.19242: Calling groups_plugins_inventory to load vars for managed_node2
30564 1726882934.19245: Calling groups_plugins_play to load vars for managed_node2
30564 1726882934.21303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
30564 1726882934.23214: done with get_vars()
30564 1726882934.23246: done queuing things up, now waiting for results queue to drain
30564 1726882934.23248: results queue empty
30564 1726882934.23249: checking for any_errors_fatal
30564 1726882934.23251: done checking for any_errors_fatal
30564 1726882934.23251: checking for max_fail_percentage
30564 1726882934.23252: done checking for max_fail_percentage
30564 1726882934.23253: checking to see if all hosts have failed and the running result is not ok
30564 1726882934.23254: done checking to see if all hosts have failed
30564 1726882934.23254: getting the remaining hosts for this loop
30564 1726882934.23256: done getting the remaining hosts for this loop
30564 1726882934.23258: getting the next task for host managed_node2
30564 1726882934.23262: done getting next task for host managed_node2
30564 1726882934.23262: ^ task is: None
30564 1726882934.23266: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
30564 1726882934.23267: done queuing things up, now waiting for results queue to drain
30564 1726882934.23268: results queue empty
30564 1726882934.23268: checking for any_errors_fatal
30564 1726882934.23269: done checking for any_errors_fatal
30564 1726882934.23270: checking for max_fail_percentage
30564 1726882934.23271: done checking for max_fail_percentage
30564 1726882934.23271: checking to see if all hosts have failed and the running result is not ok
30564 1726882934.23272: done checking to see if all hosts have failed
30564 1726882934.23274: getting the next task for host managed_node2
30564 1726882934.23277: done getting next task for host managed_node2
30564 1726882934.23277: ^ task is: None
30564 1726882934.23279: ^ state is: HOST STATE: block=11, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node2              : ok=334  changed=10   unreachable=0    failed=0    skipped=312  rescued=0    ignored=11

Friday 20 September 2024  21:42:14 -0400 (0:00:00.619)       0:02:12.815 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.76s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.76s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.76s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.72s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_states_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.68s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.66s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.66s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.62s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.62s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.61s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.59s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which packages are installed --- 1.39s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.14s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
30564 1726882934.23651: RUNNING CLEANUP